[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 8240 1726773021.15741: starting run ansible-playbook [core 2.16.11] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-EI7 executable location = /usr/local/bin/ansible-playbook python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 8240 1726773021.16053: Added group all to inventory 8240 1726773021.16054: Added group ungrouped to inventory 8240 1726773021.16058: Group all now contains ungrouped 8240 1726773021.16060: Examining possible inventory source: /tmp/kernel_settings-PVh/inventory.yml 8240 1726773021.25117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 8240 1726773021.25165: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 8240 1726773021.25181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 8240 1726773021.25225: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 8240 1726773021.25276: Loaded config def from plugin (inventory/script) 8240 1726773021.25277: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 8240 1726773021.25307: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 8240 1726773021.25367: Loaded config def from plugin (inventory/yaml) 8240 1726773021.25369: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 8240 1726773021.25430: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 8240 1726773021.25714: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 8240 1726773021.25716: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 8240 1726773021.25719: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 8240 1726773021.25723: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 8240 1726773021.25726: Loading data from /tmp/kernel_settings-PVh/inventory.yml 8240 1726773021.25769: /tmp/kernel_settings-PVh/inventory.yml was not parsable by auto 8240 1726773021.25814: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 8240 1726773021.25842: Loading data from /tmp/kernel_settings-PVh/inventory.yml 8240 1726773021.25901: group all already in inventory 8240 1726773021.25906: set inventory_file for managed_node1 8240 1726773021.25909: set inventory_dir for managed_node1 8240 1726773021.25909: Added host managed_node1 to inventory 8240 1726773021.25911: Added host managed_node1 to group all 8240 1726773021.25911: set ansible_host for managed_node1 8240 
1726773021.25912: set ansible_ssh_extra_args for managed_node1 8240 1726773021.25914: set inventory_file for managed_node2 8240 1726773021.25915: set inventory_dir for managed_node2 8240 1726773021.25916: Added host managed_node2 to inventory 8240 1726773021.25916: Added host managed_node2 to group all 8240 1726773021.25917: set ansible_host for managed_node2 8240 1726773021.25917: set ansible_ssh_extra_args for managed_node2 8240 1726773021.25919: set inventory_file for managed_node3 8240 1726773021.25920: set inventory_dir for managed_node3 8240 1726773021.25920: Added host managed_node3 to inventory 8240 1726773021.25921: Added host managed_node3 to group all 8240 1726773021.25922: set ansible_host for managed_node3 8240 1726773021.25922: set ansible_ssh_extra_args for managed_node3 8240 1726773021.25924: Reconcile groups and hosts in inventory. 8240 1726773021.25927: Group ungrouped now contains managed_node1 8240 1726773021.25928: Group ungrouped now contains managed_node2 8240 1726773021.25929: Group ungrouped now contains managed_node3 8240 1726773021.25983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 8240 1726773021.26068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 8240 1726773021.26101: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 8240 1726773021.26120: Loaded config def from plugin (vars/host_group_vars) 8240 1726773021.26122: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 8240 1726773021.26127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 8240 1726773021.26132: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8240 1726773021.26161: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 8240 1726773021.26405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773021.26472: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 8240 1726773021.26497: Loaded config def from plugin (connection/local) 8240 1726773021.26499: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 8240 1726773021.26831: Loaded config def from plugin (connection/paramiko_ssh) 8240 1726773021.26833: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 8240 1726773021.27434: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8240 1726773021.27459: Loaded config def from plugin (connection/psrp) 8240 1726773021.27461: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 8240 1726773021.27890: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8240 1726773021.27913: Loaded config def from plugin (connection/ssh) 8240 1726773021.27915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 8240 1726773021.29112: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8240 1726773021.29136: Loaded config def from plugin (connection/winrm) 8240 1726773021.29138: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 8240 1726773021.29164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 8240 1726773021.29209: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 8240 1726773021.29246: Loaded config def from plugin (shell/cmd) 8240 1726773021.29248: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 8240 1726773021.29268: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 8240 1726773021.29307: Loaded config def from plugin (shell/powershell) 8240 1726773021.29308: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 8240 1726773021.29343: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 8240 1726773021.29447: Loaded config def from plugin (shell/sh) 8240 1726773021.29448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 8240 1726773021.29474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 8240 1726773021.29548: Loaded config def from plugin (become/runas) 8240 1726773021.29549: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 8240 1726773021.29662: Loaded config def from plugin (become/su) 8240 1726773021.29663: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 8240 1726773021.29760: Loaded config def from plugin (become/sudo) 8240 1726773021.29762: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 8240 1726773021.29787: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml 8240 1726773021.30426: trying /usr/local/lib/python3.12/site-packages/ansible/modules 8240 1726773021.32458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 8240 1726773021.32491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 8240 1726773021.34144: in VariableManager get_vars() 8240 
1726773021.34165: done with get_vars() 8240 1726773021.34201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 8240 1726773021.34210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 8240 1726773021.34386: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 8240 1726773021.34479: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 8240 1726773021.34481: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-EI7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 8240 1726773021.34506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 8240 1726773021.34522: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8240 1726773021.34622: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 8240 1726773021.34658: Loaded config def from plugin (callback/default) 8240 1726773021.34659: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 8240 1726773021.35431: Loaded config def from plugin (callback/junit) 8240 1726773021.35433: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 8240 1726773021.35468: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 8240 1726773021.35509: Loaded config def from plugin (callback/minimal) 8240 1726773021.35510: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 8240 1726773021.35538: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 8240 1726773021.35580: Loaded config def from plugin (callback/tree) 8240 1726773021.35582: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 8240 1726773021.35657: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 8240 1726773021.35659: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-EI7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_change_settings.yml ******************************************** 1 plays in /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml 8240 1726773021.35678: in VariableManager get_vars() 8240 1726773021.35692: done with get_vars() 8240 1726773021.35696: in VariableManager get_vars() 8240 1726773021.35702: done with get_vars() 8240 1726773021.35704: variable 'omit' from source: magic vars 8240 1726773021.35731: in VariableManager get_vars() 8240 1726773021.35741: done with get_vars() 8240 1726773021.35758: variable 'omit' from source: magic vars PLAY [Test changing settings] ************************************************** 8240 1726773021.37696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 8240 1726773021.37753: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 8240 1726773021.37779: getting the remaining hosts for this loop 8240 1726773021.37780: done getting the remaining hosts for this loop 8240 1726773021.37782: getting the next task for host managed_node2 8240 1726773021.37787: done getting next task for host managed_node2 8240 1726773021.37792: ^ task is: TASK: Gathering Facts 8240 1726773021.37793: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773021.37794: getting variables 8240 1726773021.37795: in VariableManager get_vars() 8240 1726773021.37802: Calling all_inventory to load vars for managed_node2 8240 1726773021.37804: Calling groups_inventory to load vars for managed_node2 8240 1726773021.37806: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773021.37815: Calling all_plugins_play to load vars for managed_node2 8240 1726773021.37821: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773021.37823: Calling groups_plugins_play to load vars for managed_node2 8240 1726773021.37846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773021.37880: done with get_vars() 8240 1726773021.37886: done getting variables 8240 1726773021.37936: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:2 Thursday 19 September 2024 15:10:21 -0400 (0:00:00.023) 0:00:00.023 **** 8240 1726773021.37951: entering _queue_task() for managed_node2/gather_facts 8240 1726773021.37953: Creating lock for gather_facts 8240 1726773021.38151: worker is 1 (out of 1 available) 8240 1726773021.38165: exiting _queue_task() for managed_node2/gather_facts 8240 1726773021.38176: done queuing things up, now waiting for results queue to drain 8240 1726773021.38178: waiting for pending results... 
8243 1726773021.38256: running TaskExecutor() for managed_node2/TASK: Gathering Facts 8243 1726773021.38353: in run() - task 0affffe7-6841-885f-bbcf-00000000002f 8243 1726773021.38370: variable 'ansible_search_path' from source: unknown 8243 1726773021.38400: calling self._execute() 8243 1726773021.38491: variable 'ansible_host' from source: host vars for 'managed_node2' 8243 1726773021.38500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8243 1726773021.38508: variable 'omit' from source: magic vars 8243 1726773021.38576: variable 'omit' from source: magic vars 8243 1726773021.38598: variable 'omit' from source: magic vars 8243 1726773021.38625: variable 'omit' from source: magic vars 8243 1726773021.38660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8243 1726773021.38688: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8243 1726773021.38705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8243 1726773021.38721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8243 1726773021.38732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8243 1726773021.38756: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8243 1726773021.38761: variable 'ansible_host' from source: host vars for 'managed_node2' 8243 1726773021.38763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8243 1726773021.38828: Set connection var ansible_pipelining to False 8243 1726773021.38835: Set connection var ansible_timeout to 10 8243 1726773021.38840: Set connection var ansible_module_compression to ZIP_DEFLATED 8243 1726773021.38842: Set connection var ansible_shell_type to sh 8243 1726773021.38845: Set connection var ansible_shell_executable to /bin/sh 8243 1726773021.38848: Set connection var ansible_connection to ssh 8243 1726773021.38863: variable 'ansible_shell_executable' from source: unknown 8243 1726773021.38866: variable 'ansible_connection' from source: unknown 8243 1726773021.38869: variable 'ansible_module_compression' from source: unknown 8243 1726773021.38871: variable 'ansible_shell_type' from source: unknown 8243 1726773021.38873: variable 'ansible_shell_executable' from source: unknown 8243 1726773021.38875: variable 'ansible_host' from source: host vars for 'managed_node2' 8243 1726773021.38877: variable 'ansible_pipelining' from source: unknown 8243 1726773021.38879: variable 'ansible_timeout' from source: unknown 8243 1726773021.38881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8243 1726773021.39011: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8243 1726773021.39022: variable 'omit' from source: magic vars 8243 1726773021.39027: starting attempt loop 8243 1726773021.39031: running the handler 8243 1726773021.39043: variable 'ansible_facts' from source: unknown 8243 1726773021.39059: _low_level_execute_command(): starting 8243 1726773021.39068: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && 
sleep 0' 8243 1726773021.41346: stderr chunk (state=2): >>>Warning: Permanently added '10.31.9.64' (ECDSA) to the list of known hosts. <<< 8243 1726773021.54304: stdout chunk (state=3): >>>/root <<< 8243 1726773021.54742: stderr chunk (state=3): >>><<< 8243 1726773021.54750: stdout chunk (state=3): >>><<< 8243 1726773021.54771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=Warning: Permanently added '10.31.9.64' (ECDSA) to the list of known hosts. 8243 1726773021.54786: _low_level_execute_command(): starting 8243 1726773021.54795: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238 `" && echo ansible-tmp-1726773021.5477896-8243-158309371135238="` echo /root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238 `" ) && sleep 0' 8243 1726773021.57234: stdout chunk (state=2): >>>ansible-tmp-1726773021.5477896-8243-158309371135238=/root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238 <<< 8243 1726773021.57366: stderr chunk (state=3): >>><<< 8243 1726773021.57373: stdout chunk (state=3): >>><<< 8243 1726773021.57390: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773021.5477896-8243-158309371135238=/root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238 , stderr= 8243 1726773021.57414: variable 'ansible_module_compression' from source: unknown 8243 1726773021.57457: ANSIBALLZ: Using generic lock for ansible.legacy.setup 8243 1726773021.57464: ANSIBALLZ: Acquiring lock 8243 1726773021.57468: ANSIBALLZ: Lock acquired: 139787572477392 8243 1726773021.57473: ANSIBALLZ: Creating module 8243 1726773021.87240: ANSIBALLZ: Writing module into payload 8243 1726773021.87429: ANSIBALLZ: Writing module 8243 1726773021.87457: ANSIBALLZ: Renaming module 8243 1726773021.87464: ANSIBALLZ: Done creating module 8243 1726773021.87501: variable 'ansible_facts' from source: unknown 8243 1726773021.87510: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8243 1726773021.87521: _low_level_execute_command(): starting 8243 1726773021.87528: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0' 8243 1726773021.90031: stdout chunk (state=2): >>>PLATFORM <<< 8243 1726773021.90172: stdout chunk (state=3): >>>Linux <<< 8243 1726773021.90221: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3.6 /usr/bin/python3 /usr/libexec/platform-python ENDFOUND <<< 8243 1726773021.90404: stderr chunk (state=3): >>><<< 8243 1726773021.90411: stdout chunk (state=3): >>><<< 8243 1726773021.90430: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3.6 /usr/bin/python3 /usr/libexec/platform-python ENDFOUND , stderr= 8243 1726773021.90438 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3.6', '/usr/bin/python3', '/usr/libexec/platform-python'] 8243 1726773021.90487: _low_level_execute_command(): starting 8243 1726773021.90495: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 8243 1726773021.91109: Sending initial data 8243 1726773021.91116: Sent initial data (1234 bytes) 8243 1726773021.94823: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 8243 1726773021.95278: stderr chunk (state=3): >>><<< 8243 1726773021.95291: stdout chunk (state=3): >>><<< 8243 1726773021.95308: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr= 8243 1726773021.95380: variable 'ansible_facts' from source: unknown 8243 1726773021.95387: variable 'ansible_facts' from source: unknown 8243 1726773021.95400: variable 'ansible_module_compression' from source: unknown 8243 1726773021.95442: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8243 1726773021.95477: variable 'ansible_facts' from source: unknown 8243 1726773021.95704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238/AnsiballZ_setup.py 8243 1726773021.96275: Sending initial data 8243 1726773021.96283: Sent initial data (152 bytes) 8243 1726773021.99293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp1znbmrto /root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238/AnsiballZ_setup.py <<< 8243 1726773022.02302: stderr chunk (state=3): >>><<< 8243 1726773022.02314: stdout chunk (state=3): >>><<< 8243 1726773022.02344: done transferring module to remote 8243 1726773022.02361: _low_level_execute_command(): starting 8243 1726773022.02368: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238/ /root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238/AnsiballZ_setup.py && sleep 0' 8243 1726773022.05004: stderr chunk (state=2): >>><<< 8243 1726773022.05015: stdout chunk (state=2): >>><<< 8243 1726773022.05033: _low_level_execute_command() done: rc=0, stdout=, stderr= 8243 1726773022.05038: _low_level_execute_command(): starting 8243 1726773022.05046: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238/AnsiballZ_setup.py && sleep 0' 8243 1726773023.44654: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMo3DZDg5lt06Jrw0Qd9X74dNy/nBSXaUtMMS052uVgTKSXm0tKCx2zaAxNgM505uL6pZUEUOZeRR3Z/LNdlNSBT8r+2LnUpqcmdODxoyccRSKqmwLK1zJVzwSXZ+AjBD3x9gTlBYQayaOpqR1f05hNnHy2R3kxXoB1tNNpqpz3bAAAAFQCELYwNT97+ZXrdhwhMhoA7GWXL9QAAAIAtG2SRvcGWlL2z5hFtYYMsg8GRtVOEKlX108Ws20I7sI95Nm0WYvTIwFqYPINzLfCA+Ls/dLGPq2G5YUvm7QgMzmHhsK9TJhhd889W4OzyNzFL2GT6B86x7dZphalrTs/0syAVSP84E66QTj7TJU/HFsVowFD4iq3yKCBHZJJADAAAAIEAhWG94qCQeDFTxKLHPtQNkV9HI8hfIJXDM0pIL2n7yQ4TU9nWEOQtJjRFpp8k2NZ14U2EHZ7RelwhzfDZFiBK/2NQ+JoDjS0bFevNKG07tHLq+FXOxS5Gysh8BpFPLhRgxptusyg4njXv6abAem4QO5Gnikd8Ctf4orp3mf5Mo/w=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCpwL8CyExbkE87G4Z2ELO87C73MrB52pvN1/REWnLK8RyqlqY8GHCeODUP+lN9RROaKP+1mUq3R5P1vUyf0NUVZoeIitePd6dIY/HwffaeTzXLBp5sMcjPisFL9fVo1g9PkYZwmRgL4IDj39kSp4ttnuRynttW4g2Rs1HM67H6KkzKpM6kihrSGu78vUz+DUKL+CHSg8G9JAZwNYy4MIhlxZCBD6JVscaKv4UDDIKGaxur3MJxlFE5md8KVyzl2k+WSa/7XfMN2st+rOPN7S0/rxSQCqHrUjPlqorz9aGTRlP4RZAYaDtqE4c90/EHeAATAfCsJhdktOAD9qONVrn3xGVN5xCXGmMfYLZ45DqZVWZ4YX5ZyL6QgAb+85FH2gkWOHTqYMI6TEV7e1J7AWXpkqKygVZtILvPsrKCUYHLORO3bEdTWm5PcqLwhzTi7Myybh+twLfLdY2Yz1rPuCkWuI/Cz5RdyJxNeH9XbnhvF+7lVXY/7xuObH99wqWSFCE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAWpvYoviifCdAjKvSxQ8CBgYzbKEPHp2fMY65o4pwBevyZghxLsKaAsi+dFVlgZs0/UVAVgvbOXtdqsH0tvoHQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-n<<< 8243 1726773023.44720: stdout chunk (state=3): >>>istp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBNujYY/kuSC3n5Sb7T5pTC/SxbGKraWJ1B8z8Tcma+S", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-9-64.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-64", "ansible_nodename": "ip-10-31-9-64.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "df37f0c23b234636ab118236eb740b01", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2716, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 823, "free": 
2716}, "nocache": {"free": 3303, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2129e7-60e3-3c02-08d8-26f837ba4057", "ansible_product_uuid": "ec2129e7-60e3-3c02-08d8-26f837ba4057", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 418, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263504470016, "block_size": 4096, "block_total": 65533179, "block_available": 64332146, "block_used": 1201033, "inode_total": 131071472, "inode_available": 130994307, "inode_used": 77165, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_hostnqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.35, "5m": 0.2, "15m": 0.1}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:e8:f7:4d:72:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.64", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10e8:f7ff:fe4d:72f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": 
"on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.64", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:e8:f7:4d:72:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.64"], "ansible_all_ipv6_addresses": ["fe80::10e8:f7ff:fe4d:72f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.64", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10e8:f7ff:fe4d:72f5"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.7 38020 10.31.9.64 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.14.7 38020 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "23", "epoch": "1726773023", "epoch_int": "1726773023", "date": "2024-09-19", "time": "15:10:23", "iso8601_micro": "2024-09-19T19:10:23.440502Z", "iso8601": "2024-09-19T19:10:23Z", "iso8601_basic": "20240919T151023440502", "iso8601_basic_short": "20240919T151023", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8243 1726773023.46387: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 8243 1726773023.46398: stdout chunk (state=3): >>><<< 8243 1726773023.46410: stderr chunk (state=3): >>><<< 8243 1726773023.46437: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMo3DZDg5lt06Jrw0Qd9X74dNy/nBSXaUtMMS052uVgTKSXm0tKCx2zaAxNgM505uL6pZUEUOZeRR3Z/LNdlNSBT8r+2LnUpqcmdODxoyccRSKqmwLK1zJVzwSXZ+AjBD3x9gTlBYQayaOpqR1f05hNnHy2R3kxXoB1tNNpqpz3bAAAAFQCELYwNT97+ZXrdhwhMhoA7GWXL9QAAAIAtG2SRvcGWlL2z5hFtYYMsg8GRtVOEKlX108Ws20I7sI95Nm0WYvTIwFqYPINzLfCA+Ls/dLGPq2G5YUvm7QgMzmHhsK9TJhhd889W4OzyNzFL2GT6B86x7dZphalrTs/0syAVSP84E66QTj7TJU/HFsVowFD4iq3yKCBHZJJADAAAAIEAhWG94qCQeDFTxKLHPtQNkV9HI8hfIJXDM0pIL2n7yQ4TU9nWEOQtJjRFpp8k2NZ14U2EHZ7RelwhzfDZFiBK/2NQ+JoDjS0bFevNKG07tHLq+FXOxS5Gysh8BpFPLhRgxptusyg4njXv6abAem4QO5Gnikd8Ctf4orp3mf5Mo/w=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCpwL8CyExbkE87G4Z2ELO87C73MrB52pvN1/REWnLK8RyqlqY8GHCeODUP+lN9RROaKP+1mUq3R5P1vUyf0NUVZoeIitePd6dIY/HwffaeTzXLBp5sMcjPisFL9fVo1g9PkYZwmRgL4IDj39kSp4ttnuRynttW4g2Rs1HM67H6KkzKpM6kihrSGu78vUz+DUKL+CHSg8G9JAZwNYy4MIhlxZCBD6JVscaKv4UDDIKGaxur3MJxlFE5md8KVyzl2k+WSa/7XfMN2st+rOPN7S0/rxSQCqHrUjPlqorz9aGTRlP4RZAYaDtqE4c90/EHeAATAfCsJhdktOAD9qONVrn3xGVN5xCXGmMfYLZ45DqZVWZ4YX5ZyL6QgAb+85FH2gkWOHTqYMI6TEV7e1J7AWXpkqKygVZtILvPsrKCUYHLORO3bEdTWm5PcqLwhzTi7Myybh+twLfLdY2Yz1rPuCkWuI/Cz5RdyJxNeH9XbnhvF+7lVXY/7xuObH99wqWSFCE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAWpvYoviifCdAjKvSxQ8CBgYzbKEPHp2fMY65o4pwBevyZghxLsKaAsi+dFVlgZs0/UVAVgvbOXtdqsH0tvoHQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBNujYY/kuSC3n5Sb7T5pTC/SxbGKraWJ1B8z8Tcma+S", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-9-64.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-64", "ansible_nodename": "ip-10-31-9-64.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "df37f0c23b234636ab118236eb740b01", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2716, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 823, "free": 2716}, "nocache": {"free": 3303, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2129e7-60e3-3c02-08d8-26f837ba4057", "ansible_product_uuid": "ec2129e7-60e3-3c02-08d8-26f837ba4057", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 418, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263504470016, "block_size": 4096, "block_total": 65533179, "block_available": 64332146, "block_used": 1201033, "inode_total": 131071472, "inode_available": 130994307, "inode_used": 77165, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_hostnqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.35, "5m": 0.2, "15m": 0.1}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:e8:f7:4d:72:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.64", "broadcast": 
"10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10e8:f7ff:fe4d:72f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": 
"off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.64", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:e8:f7:4d:72:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.64"], "ansible_all_ipv6_addresses": ["fe80::10e8:f7ff:fe4d:72f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.64", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10e8:f7ff:fe4d:72f5"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.7 38020 10.31.9.64 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.14.7 38020 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "23", "epoch": "1726773023", "epoch_int": "1726773023", "date": "2024-09-19", "time": "15:10:23", "iso8601_micro": "2024-09-19T19:10:23.440502Z", "iso8601": "2024-09-19T19:10:23Z", "iso8601_basic": "20240919T151023440502", "iso8601_basic_short": "20240919T151023", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, 
"filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.9.64 closed. 8243 1726773023.46815: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8243 1726773023.46843: _low_level_execute_command(): starting 8243 1726773023.46850: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773021.5477896-8243-158309371135238/ > /dev/null 2>&1 && sleep 0' 8243 1726773023.49473: stderr chunk (state=2): >>><<< 8243 1726773023.49487: stdout chunk (state=2): >>><<< 8243 1726773023.49506: _low_level_execute_command() done: rc=0, stdout=, stderr= 8243 1726773023.49515: handler run complete 8243 1726773023.49636: variable 'ansible_facts' from source: unknown 8243 1726773023.49735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8243 1726773023.50045: variable 'ansible_facts' from source: unknown 8243 1726773023.50126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8243 1726773023.50250: attempt loop complete, returning result 8243 1726773023.50259: _execute() done 8243 1726773023.50264: dumping result to json 8243 1726773023.50292: done dumping result, returning 8243 1726773023.50302: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affffe7-6841-885f-bbcf-00000000002f] 8243 1726773023.50308: sending task result for task 0affffe7-6841-885f-bbcf-00000000002f 8243 1726773023.50526: done sending task result for task 0affffe7-6841-885f-bbcf-00000000002f 8243 1726773023.50530: WORKER PROCESS EXITING ok: [managed_node2] 8240 1726773023.51226: no more pending results, returning what we have 8240 1726773023.51229: results queue empty 8240 1726773023.51230: checking for any_errors_fatal 8240 1726773023.51232: done checking for any_errors_fatal 8240 1726773023.51233: checking for max_fail_percentage 8240 1726773023.51234: done checking for max_fail_percentage 8240 1726773023.51234: checking to see if all hosts have failed and the running result is not ok 8240 1726773023.51235: done checking to see if all hosts have failed 8240 1726773023.51236: getting the remaining hosts for this loop 8240 1726773023.51237: done getting the remaining hosts for this loop 8240 1726773023.51240: getting the next task for host managed_node2 8240 1726773023.51245: done getting next task for host managed_node2 8240 1726773023.51246: ^ task is: TASK: meta (flush_handlers) 8240 1726773023.51248: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773023.51251: getting variables 8240 1726773023.51252: in VariableManager get_vars() 8240 1726773023.51275: Calling all_inventory to load vars for managed_node2 8240 1726773023.51277: Calling groups_inventory to load vars for managed_node2 8240 1726773023.51280: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773023.51290: Calling all_plugins_play to load vars for managed_node2 8240 1726773023.51292: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773023.51295: Calling groups_plugins_play to load vars for managed_node2 8240 1726773023.51473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773023.51680: done with get_vars() 8240 1726773023.51692: done getting variables 8240 1726773023.51761: in VariableManager get_vars() 8240 1726773023.51770: Calling all_inventory to load vars for managed_node2 8240 1726773023.51773: Calling groups_inventory to load vars for managed_node2 8240 1726773023.51775: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773023.51780: Calling all_plugins_play to load vars for managed_node2 8240 1726773023.51782: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773023.51786: Calling groups_plugins_play to load vars for managed_node2 8240 1726773023.51920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773023.52106: done with get_vars() 8240 1726773023.52120: done queuing things up, now waiting for results queue to drain 8240 1726773023.52122: results queue empty 8240 1726773023.52123: checking for any_errors_fatal 8240 1726773023.52126: done checking for any_errors_fatal 8240 1726773023.52127: checking for max_fail_percentage 8240 1726773023.52128: done checking for max_fail_percentage 8240 1726773023.52128: checking to see if all hosts have failed and the running result is not ok 8240 1726773023.52129: done checking to see if all hosts have failed 8240 1726773023.52130: getting the remaining hosts for this loop 8240 1726773023.52130: done getting the remaining hosts for this loop 8240 1726773023.52133: getting the next task for host managed_node2 8240 1726773023.52138: done getting next task for host managed_node2 8240 1726773023.52140: ^ task is: TASK: Check if system is ostree 8240 1726773023.52143: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773023.52145: getting variables 8240 1726773023.52145: in VariableManager get_vars() 8240 1726773023.52153: Calling all_inventory to load vars for managed_node2 8240 1726773023.52158: Calling groups_inventory to load vars for managed_node2 8240 1726773023.52160: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773023.52165: Calling all_plugins_play to load vars for managed_node2 8240 1726773023.52167: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773023.52170: Calling groups_plugins_play to load vars for managed_node2 8240 1726773023.52305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773023.52486: done with get_vars() 8240 1726773023.52494: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:12 Thursday 19 September 2024 15:10:23 -0400 (0:00:02.146) 0:00:02.169 **** 8240 1726773023.52573: entering _queue_task() for managed_node2/stat 8240 1726773023.52822: worker is 1 (out of 1 available) 8240 1726773023.52838: exiting _queue_task() for managed_node2/stat 8240 1726773023.52851: done queuing things up, now waiting for results queue to drain 8240 1726773023.52853: waiting for pending results... 8313 1726773023.53064: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 8313 1726773023.53195: in run() - task 0affffe7-6841-885f-bbcf-000000000007 8313 1726773023.53214: variable 'ansible_search_path' from source: unknown 8313 1726773023.53247: calling self._execute() 8313 1726773023.53317: variable 'ansible_host' from source: host vars for 'managed_node2' 8313 1726773023.53327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8313 1726773023.53337: variable 'omit' from source: magic vars 8313 1726773023.53778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8313 1726773023.54040: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8313 1726773023.54088: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8313 1726773023.54130: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8313 1726773023.54167: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8313 1726773023.54248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8313 1726773023.54279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8313 1726773023.54333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8313 1726773023.54363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8313 1726773023.54489: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8313 
1726773023.54507: variable 'omit' from source: magic vars 8313 1726773023.54544: variable 'omit' from source: magic vars 8313 1726773023.54584: variable 'omit' from source: magic vars 8313 1726773023.54612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8313 1726773023.54639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8313 1726773023.54661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8313 1726773023.54678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8313 1726773023.54691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8313 1726773023.54720: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8313 1726773023.54728: variable 'ansible_host' from source: host vars for 'managed_node2' 8313 1726773023.54732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8313 1726773023.54828: Set connection var ansible_pipelining to False 8313 1726773023.54836: Set connection var ansible_timeout to 10 8313 1726773023.54844: Set connection var ansible_module_compression to ZIP_DEFLATED 8313 1726773023.54848: Set connection var ansible_shell_type to sh 8313 1726773023.54853: Set connection var ansible_shell_executable to /bin/sh 8313 1726773023.54861: Set connection var ansible_connection to ssh 8313 1726773023.54881: variable 'ansible_shell_executable' from source: unknown 8313 1726773023.54888: variable 'ansible_connection' from source: unknown 8313 1726773023.54892: variable 'ansible_module_compression' from source: unknown 8313 1726773023.54895: variable 'ansible_shell_type' from source: unknown 8313 1726773023.54897: variable 'ansible_shell_executable' from source: unknown 8313 1726773023.54900: variable 'ansible_host' from source: host vars for 'managed_node2' 8313 1726773023.54904: variable 'ansible_pipelining' from source: unknown 8313 1726773023.54907: variable 'ansible_timeout' from source: unknown 8313 1726773023.54911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8313 1726773023.55044: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8313 1726773023.55058: variable 'omit' from source: magic vars 8313 1726773023.55064: starting attempt loop 8313 1726773023.55068: running the handler 8313 1726773023.55081: _low_level_execute_command(): starting 8313 1726773023.55091: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8313 1726773023.57852: stdout chunk (state=2): >>>/root <<< 8313 1726773023.58266: stderr chunk (state=3): >>><<< 8313 1726773023.58276: stdout chunk (state=3): >>><<< 8313 1726773023.58301: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8313 1726773023.58318: _low_level_execute_command(): starting 8313 1726773023.58324: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512 `" && echo ansible-tmp-1726773023.583114-8313-178704780190512="` echo 
/root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512 `" ) && sleep 0' 8313 1726773023.62035: stdout chunk (state=2): >>>ansible-tmp-1726773023.583114-8313-178704780190512=/root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512 <<< 8313 1726773023.62231: stderr chunk (state=3): >>><<< 8313 1726773023.62240: stdout chunk (state=3): >>><<< 8313 1726773023.62260: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773023.583114-8313-178704780190512=/root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512 , stderr= 8313 1726773023.62307: variable 'ansible_module_compression' from source: unknown 8313 1726773023.62372: ANSIBALLZ: Using lock for stat 8313 1726773023.62378: ANSIBALLZ: Acquiring lock 8313 1726773023.62381: ANSIBALLZ: Lock acquired: 139787572728976 8313 1726773023.62384: ANSIBALLZ: Creating module 8313 1726773023.71754: ANSIBALLZ: Writing module into payload 8313 1726773023.71841: ANSIBALLZ: Writing module 8313 1726773023.71863: ANSIBALLZ: Renaming module 8313 1726773023.71870: ANSIBALLZ: Done creating module 8313 1726773023.71888: variable 'ansible_facts' from source: unknown 8313 1726773023.71950: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512/AnsiballZ_stat.py 8313 1726773023.72054: Sending initial data 8313 1726773023.72064: Sent initial data (150 bytes) 8313 1726773023.74670: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp7onehe3_ /root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512/AnsiballZ_stat.py <<< 8313 1726773023.75772: stderr chunk (state=3): >>><<< 8313 1726773023.75780: stdout chunk (state=3): >>><<< 8313 1726773023.75801: done transferring module to remote 8313 1726773023.75818: _low_level_execute_command(): starting 8313 1726773023.75825: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512/ /root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512/AnsiballZ_stat.py && sleep 0' 8313 1726773023.78234: stderr chunk (state=2): >>><<< 8313 1726773023.78244: stdout chunk (state=2): >>><<< 8313 1726773023.78263: _low_level_execute_command() done: rc=0, stdout=, stderr= 8313 1726773023.78268: _low_level_execute_command(): starting 8313 1726773023.78274: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512/AnsiballZ_stat.py && sleep 0' 8313 1726773023.93319: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8313 1726773023.94507: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8313 1726773023.94519: stdout chunk (state=3): >>><<< 8313 1726773023.94531: stderr chunk (state=3): >>><<< 8313 1726773023.94545: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 
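The module invocation returned above is a plain stat probe of /run/ostree-booted. A minimal sketch of a task that would produce this invocation — wording assumed, not quoted from tests_change_settings.yml:12 — looks roughly like:

  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted    # marker file present only on rpm-ostree systems
    register: __ostree_booted_stat
    when: not __kernel_settings_is_ostree is defined

On this host the marker file does not exist, so the registered result carries stat.exists: false.
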
8313 1726773023.94589: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8313 1726773023.94601: _low_level_execute_command(): starting 8313 1726773023.94607: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773023.583114-8313-178704780190512/ > /dev/null 2>&1 && sleep 0' 8313 1726773023.97297: stderr chunk (state=2): >>><<< 8313 1726773023.97311: stdout chunk (state=2): >>><<< 8313 1726773023.97328: _low_level_execute_command() done: rc=0, stdout=, stderr= 8313 1726773023.97335: handler run complete 8313 1726773023.97354: attempt loop complete, returning result 8313 1726773023.97358: _execute() done 8313 1726773023.97361: dumping result to json 8313 1726773023.97366: done dumping result, returning 8313 1726773023.97373: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0affffe7-6841-885f-bbcf-000000000007] 8313 1726773023.97379: sending task result for task 0affffe7-6841-885f-bbcf-000000000007 8313 1726773023.97415: done sending task result for task 0affffe7-6841-885f-bbcf-000000000007 8313 1726773023.97420: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 8240 1726773023.97825: no more pending results, returning what we have 8240 1726773023.97828: results queue empty 8240 1726773023.97829: checking for any_errors_fatal 8240 1726773023.97831: done checking for any_errors_fatal 8240 1726773023.97831: checking for max_fail_percentage 8240 1726773023.97833: done checking for max_fail_percentage 8240 1726773023.97833: checking to see if all hosts have failed and the running result is not ok 8240 1726773023.97834: done checking to see if all hosts have failed 8240 1726773023.97835: getting the remaining hosts for this loop 8240 1726773023.97836: done getting the remaining hosts for this loop 8240 1726773023.97839: getting the next task for host managed_node2 8240 1726773023.97844: done getting next task for host managed_node2 8240 1726773023.97846: ^ task is: TASK: Set flag to indicate system is ostree 8240 1726773023.97848: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773023.97851: getting variables 8240 1726773023.97852: in VariableManager get_vars() 8240 1726773023.97877: Calling all_inventory to load vars for managed_node2 8240 1726773023.97880: Calling groups_inventory to load vars for managed_node2 8240 1726773023.97883: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773023.97894: Calling all_plugins_play to load vars for managed_node2 8240 1726773023.97897: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773023.97900: Calling groups_plugins_play to load vars for managed_node2 8240 1726773023.98058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773023.98248: done with get_vars() 8240 1726773023.98258: done getting variables 8240 1726773023.98347: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:17 Thursday 19 September 2024 15:10:23 -0400 (0:00:00.457) 0:00:02.627 **** 8240 1726773023.98374: entering _queue_task() for managed_node2/set_fact 8240 1726773023.98376: Creating lock for set_fact 8240 1726773023.98573: worker is 1 (out of 1 available) 8240 1726773023.98588: exiting _queue_task() for managed_node2/set_fact 8240 1726773023.98601: done queuing things up, now waiting for results queue to drain 8240 1726773023.98603: waiting for pending results... 
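The set_fact task queued here turns that stat result into a boolean flag. A hedged sketch consistent with the conditional and the fact shown in the result below (the exact Jinja expression is assumed) would be:

  - name: Set flag to indicate system is ostree
    set_fact:
      __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __kernel_settings_is_ostree is defined

With /run/ostree-booted absent, this sets __kernel_settings_is_ostree to false, matching the ansible_facts in the task result that follows.
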
8329 1726773023.98858: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 8329 1726773023.98977: in run() - task 0affffe7-6841-885f-bbcf-000000000008 8329 1726773023.98996: variable 'ansible_search_path' from source: unknown 8329 1726773023.99027: calling self._execute() 8329 1726773023.99092: variable 'ansible_host' from source: host vars for 'managed_node2' 8329 1726773023.99103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8329 1726773023.99111: variable 'omit' from source: magic vars 8329 1726773023.99525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8329 1726773023.99799: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8329 1726773023.99842: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8329 1726773023.99875: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8329 1726773023.99909: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8329 1726773023.99989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8329 1726773024.00013: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8329 1726773024.00036: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8329 1726773024.00058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8329 1726773024.00167: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8329 1726773024.00175: variable 'omit' from source: magic vars 8329 1726773024.00211: variable 'omit' from source: magic vars 8329 1726773024.00316: variable '__ostree_booted_stat' from source: set_fact 8329 1726773024.00361: variable 'omit' from source: magic vars 8329 1726773024.00383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8329 1726773024.00410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8329 1726773024.00430: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8329 1726773024.00446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8329 1726773024.00457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8329 1726773024.00483: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8329 1726773024.00492: variable 'ansible_host' from source: host vars for 'managed_node2' 8329 1726773024.00496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8329 1726773024.00583: Set connection var ansible_pipelining to False 8329 1726773024.00592: Set connection var ansible_timeout to 10 8329 1726773024.00600: Set connection var 
ansible_module_compression to ZIP_DEFLATED 8329 1726773024.00603: Set connection var ansible_shell_type to sh 8329 1726773024.00608: Set connection var ansible_shell_executable to /bin/sh 8329 1726773024.00612: Set connection var ansible_connection to ssh 8329 1726773024.00630: variable 'ansible_shell_executable' from source: unknown 8329 1726773024.00633: variable 'ansible_connection' from source: unknown 8329 1726773024.00636: variable 'ansible_module_compression' from source: unknown 8329 1726773024.00638: variable 'ansible_shell_type' from source: unknown 8329 1726773024.00641: variable 'ansible_shell_executable' from source: unknown 8329 1726773024.00643: variable 'ansible_host' from source: host vars for 'managed_node2' 8329 1726773024.00646: variable 'ansible_pipelining' from source: unknown 8329 1726773024.00648: variable 'ansible_timeout' from source: unknown 8329 1726773024.00652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8329 1726773024.00772: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8329 1726773024.00783: variable 'omit' from source: magic vars 8329 1726773024.00791: starting attempt loop 8329 1726773024.00794: running the handler 8329 1726773024.00804: handler run complete 8329 1726773024.00813: attempt loop complete, returning result 8329 1726773024.00816: _execute() done 8329 1726773024.00819: dumping result to json 8329 1726773024.00822: done dumping result, returning 8329 1726773024.00827: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0affffe7-6841-885f-bbcf-000000000008] 8329 1726773024.00832: sending task result for task 0affffe7-6841-885f-bbcf-000000000008 8329 1726773024.00854: done sending task result for task 0affffe7-6841-885f-bbcf-000000000008 8329 1726773024.00856: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_is_ostree": false }, "changed": false } 8240 1726773024.01235: no more pending results, returning what we have 8240 1726773024.01237: results queue empty 8240 1726773024.01238: checking for any_errors_fatal 8240 1726773024.01244: done checking for any_errors_fatal 8240 1726773024.01245: checking for max_fail_percentage 8240 1726773024.01246: done checking for max_fail_percentage 8240 1726773024.01247: checking to see if all hosts have failed and the running result is not ok 8240 1726773024.01248: done checking to see if all hosts have failed 8240 1726773024.01248: getting the remaining hosts for this loop 8240 1726773024.01249: done getting the remaining hosts for this loop 8240 1726773024.01252: getting the next task for host managed_node2 8240 1726773024.01259: done getting next task for host managed_node2 8240 1726773024.01261: ^ task is: TASK: Ensure required packages are installed 8240 1726773024.01262: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773024.01265: getting variables 8240 1726773024.01266: in VariableManager get_vars() 8240 1726773024.01290: Calling all_inventory to load vars for managed_node2 8240 1726773024.01293: Calling groups_inventory to load vars for managed_node2 8240 1726773024.01296: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773024.01305: Calling all_plugins_play to load vars for managed_node2 8240 1726773024.01307: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773024.01310: Calling groups_plugins_play to load vars for managed_node2 8240 1726773024.01491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773024.01677: done with get_vars() 8240 1726773024.01689: done getting variables 8240 1726773024.01776: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure required packages are installed] ********************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:22 Thursday 19 September 2024 15:10:24 -0400 (0:00:00.034) 0:00:02.661 **** 8240 1726773024.01808: entering _queue_task() for managed_node2/package 8240 1726773024.01810: Creating lock for package 8240 1726773024.01995: worker is 1 (out of 1 available) 8240 1726773024.02007: exiting _queue_task() for managed_node2/package 8240 1726773024.02018: done queuing things up, now waiting for results queue to drain 8240 1726773024.02020: waiting for pending results... 
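The package task queued here installs the role's prerequisites. The dnf module arguments logged below (name: tuned, procps-ng; state: present) suggest a task roughly like this sketch, with the task text assumed rather than quoted:

  - name: Ensure required packages are installed
    package:
      name:
        - tuned
        - procps-ng
      state: present

On this host both packages are already installed, so dnf reports "Nothing to do" and the task comes back ok with changed: false.
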
8330 1726773024.02459: running TaskExecutor() for managed_node2/TASK: Ensure required packages are installed 8330 1726773024.02565: in run() - task 0affffe7-6841-885f-bbcf-000000000009 8330 1726773024.02582: variable 'ansible_search_path' from source: unknown 8330 1726773024.02614: calling self._execute() 8330 1726773024.02674: variable 'ansible_host' from source: host vars for 'managed_node2' 8330 1726773024.02683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8330 1726773024.02693: variable 'omit' from source: magic vars 8330 1726773024.02787: variable 'omit' from source: magic vars 8330 1726773024.02819: variable 'omit' from source: magic vars 8330 1726773024.03182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8330 1726773024.04860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8330 1726773024.04910: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8330 1726773024.04942: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8330 1726773024.04979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8330 1726773024.05003: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8330 1726773024.05076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8330 1726773024.05099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8330 1726773024.05120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8330 1726773024.05149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8330 1726773024.05163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8330 1726773024.05247: variable '__kernel_settings_is_ostree' from source: set_fact 8330 1726773024.05258: variable 'omit' from source: magic vars 8330 1726773024.05290: variable 'omit' from source: magic vars 8330 1726773024.05311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8330 1726773024.05333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8330 1726773024.05349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8330 1726773024.05366: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8330 1726773024.05376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8330 1726773024.05403: variable 'inventory_hostname' from source: 
host vars for 'managed_node2' 8330 1726773024.05408: variable 'ansible_host' from source: host vars for 'managed_node2' 8330 1726773024.05412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8330 1726773024.05481: Set connection var ansible_pipelining to False 8330 1726773024.05490: Set connection var ansible_timeout to 10 8330 1726773024.05498: Set connection var ansible_module_compression to ZIP_DEFLATED 8330 1726773024.05501: Set connection var ansible_shell_type to sh 8330 1726773024.05506: Set connection var ansible_shell_executable to /bin/sh 8330 1726773024.05511: Set connection var ansible_connection to ssh 8330 1726773024.05528: variable 'ansible_shell_executable' from source: unknown 8330 1726773024.05532: variable 'ansible_connection' from source: unknown 8330 1726773024.05536: variable 'ansible_module_compression' from source: unknown 8330 1726773024.05539: variable 'ansible_shell_type' from source: unknown 8330 1726773024.05543: variable 'ansible_shell_executable' from source: unknown 8330 1726773024.05545: variable 'ansible_host' from source: host vars for 'managed_node2' 8330 1726773024.05548: variable 'ansible_pipelining' from source: unknown 8330 1726773024.05551: variable 'ansible_timeout' from source: unknown 8330 1726773024.05553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8330 1726773024.05627: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8330 1726773024.05638: variable 'omit' from source: magic vars 8330 1726773024.05647: starting attempt loop 8330 1726773024.05654: running the handler 8330 1726773024.05750: variable 'ansible_facts' from source: unknown 8330 1726773024.05856: _low_level_execute_command(): starting 8330 1726773024.05865: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8330 1726773024.08295: stdout chunk (state=2): >>>/root <<< 8330 1726773024.08488: stderr chunk (state=3): >>><<< 8330 1726773024.08495: stdout chunk (state=3): >>><<< 8330 1726773024.08514: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8330 1726773024.08526: _low_level_execute_command(): starting 8330 1726773024.08532: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825 `" && echo ansible-tmp-1726773024.0852232-8330-228750028484825="` echo /root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825 `" ) && sleep 0' 8330 1726773024.11166: stdout chunk (state=2): >>>ansible-tmp-1726773024.0852232-8330-228750028484825=/root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825 <<< 8330 1726773024.11271: stderr chunk (state=3): >>><<< 8330 1726773024.11279: stdout chunk (state=3): >>><<< 8330 1726773024.11301: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773024.0852232-8330-228750028484825=/root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825 , stderr= 8330 1726773024.11331: variable 'ansible_module_compression' from source: unknown 8330 1726773024.11395: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 8330 1726773024.11401: ANSIBALLZ: Acquiring lock 8330 1726773024.11404: ANSIBALLZ: Lock acquired: 139787572477392 8330 
1726773024.11407: ANSIBALLZ: Creating module 8330 1726773024.24850: ANSIBALLZ: Writing module into payload 8330 1726773024.25052: ANSIBALLZ: Writing module 8330 1726773024.25076: ANSIBALLZ: Renaming module 8330 1726773024.25083: ANSIBALLZ: Done creating module 8330 1726773024.25102: variable 'ansible_facts' from source: unknown 8330 1726773024.25178: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825/AnsiballZ_dnf.py 8330 1726773024.25286: Sending initial data 8330 1726773024.25293: Sent initial data (150 bytes) 8330 1726773024.27911: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpo3ocsd22 /root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825/AnsiballZ_dnf.py <<< 8330 1726773024.29288: stderr chunk (state=3): >>><<< 8330 1726773024.29296: stdout chunk (state=3): >>><<< 8330 1726773024.29319: done transferring module to remote 8330 1726773024.29331: _low_level_execute_command(): starting 8330 1726773024.29336: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825/ /root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825/AnsiballZ_dnf.py && sleep 0' 8330 1726773024.31677: stderr chunk (state=2): >>><<< 8330 1726773024.31687: stdout chunk (state=2): >>><<< 8330 1726773024.31703: _low_level_execute_command() done: rc=0, stdout=, stderr= 8330 1726773024.31707: _low_level_execute_command(): starting 8330 1726773024.31712: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825/AnsiballZ_dnf.py && sleep 0' 8330 1726773029.53620: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "procps-ng"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8330 1726773029.64498: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 8330 1726773029.64544: stderr chunk (state=3): >>><<< 8330 1726773029.64552: stdout chunk (state=3): >>><<< 8330 1726773029.64569: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "procps-ng"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.9.64 closed. 8330 1726773029.64606: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'procps-ng'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8330 1726773029.64616: _low_level_execute_command(): starting 8330 1726773029.64622: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773024.0852232-8330-228750028484825/ > /dev/null 2>&1 && sleep 0' 8330 1726773029.67058: stderr chunk (state=2): >>><<< 8330 1726773029.67070: stdout chunk (state=2): >>><<< 8330 1726773029.67088: _low_level_execute_command() done: rc=0, stdout=, stderr= 8330 1726773029.67097: handler run complete 8330 1726773029.67122: attempt loop complete, returning result 8330 1726773029.67128: _execute() done 8330 1726773029.67131: dumping result to json 8330 1726773029.67138: done dumping result, returning 8330 1726773029.67145: done running TaskExecutor() for managed_node2/TASK: Ensure required packages are installed [0affffe7-6841-885f-bbcf-000000000009] 8330 1726773029.67150: sending task result for task 0affffe7-6841-885f-bbcf-000000000009 8330 1726773029.67180: done sending task result for task 0affffe7-6841-885f-bbcf-000000000009 8330 1726773029.67184: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8240 1726773029.67325: no more pending results, returning what we have 8240 1726773029.67328: results queue empty 8240 1726773029.67329: checking for any_errors_fatal 8240 1726773029.67333: done checking for any_errors_fatal 8240 1726773029.67334: checking for max_fail_percentage 8240 1726773029.67335: done checking for max_fail_percentage 8240 1726773029.67336: checking to see if all hosts have failed and the running result is not ok 8240 1726773029.67337: done checking to see if all hosts have failed 8240 1726773029.67337: getting the remaining hosts for this loop 8240 1726773029.67338: done getting the remaining hosts for this loop 8240 1726773029.67341: 
getting the next task for host managed_node2 8240 1726773029.67346: done getting next task for host managed_node2 8240 1726773029.67348: ^ task is: TASK: See if tuned has a profile subdir 8240 1726773029.67349: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773029.67351: getting variables 8240 1726773029.67353: in VariableManager get_vars() 8240 1726773029.67379: Calling all_inventory to load vars for managed_node2 8240 1726773029.67382: Calling groups_inventory to load vars for managed_node2 8240 1726773029.67387: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773029.67398: Calling all_plugins_play to load vars for managed_node2 8240 1726773029.67400: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773029.67403: Calling groups_plugins_play to load vars for managed_node2 8240 1726773029.67537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773029.67642: done with get_vars() 8240 1726773029.67650: done getting variables TASK [See if tuned has a profile subdir] *************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:28 Thursday 19 September 2024 15:10:29 -0400 (0:00:05.659) 0:00:08.321 **** 8240 1726773029.67717: entering _queue_task() for managed_node2/stat 8240 1726773029.67875: worker is 1 (out of 1 available) 8240 1726773029.67889: exiting _queue_task() for managed_node2/stat 8240 1726773029.67901: done queuing things up, now waiting for results queue to drain 8240 1726773029.67903: waiting for pending results... 
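The next two tasks locate the tuned profile directory. Based on the stat invocation and the __profile_dir value logged below, a plausible sketch — the ternary expression and the __dir task var are assumptions — is:

  - name: See if tuned has a profile subdir
    stat:
      path: /etc/tuned/profiles
    register: __tuned_profiles

  - name: Set profile dir
    set_fact:
      __profile_dir: "{{ __dir }}/kernel_settings"
    vars:
      __dir: "{{ __tuned_profiles.stat.exists | ternary('/etc/tuned/profiles', '/etc/tuned') }}"

Since /etc/tuned/profiles does not exist here, __profile_dir resolves to /etc/tuned/kernel_settings, which is the value shown in the "Set profile dir" result below.
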
8436 1726773029.67999: running TaskExecutor() for managed_node2/TASK: See if tuned has a profile subdir 8436 1726773029.68090: in run() - task 0affffe7-6841-885f-bbcf-00000000000a 8436 1726773029.68107: variable 'ansible_search_path' from source: unknown 8436 1726773029.68134: calling self._execute() 8436 1726773029.68192: variable 'ansible_host' from source: host vars for 'managed_node2' 8436 1726773029.68201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8436 1726773029.68209: variable 'omit' from source: magic vars 8436 1726773029.68281: variable 'omit' from source: magic vars 8436 1726773029.68308: variable 'omit' from source: magic vars 8436 1726773029.68333: variable 'omit' from source: magic vars 8436 1726773029.68367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8436 1726773029.68442: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8436 1726773029.68463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8436 1726773029.68479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8436 1726773029.68492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8436 1726773029.68515: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8436 1726773029.68520: variable 'ansible_host' from source: host vars for 'managed_node2' 8436 1726773029.68525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8436 1726773029.68599: Set connection var ansible_pipelining to False 8436 1726773029.68606: Set connection var ansible_timeout to 10 8436 1726773029.68615: Set connection var ansible_module_compression to ZIP_DEFLATED 8436 1726773029.68618: Set connection var ansible_shell_type to sh 8436 1726773029.68623: Set connection var ansible_shell_executable to /bin/sh 8436 1726773029.68628: Set connection var ansible_connection to ssh 8436 1726773029.68644: variable 'ansible_shell_executable' from source: unknown 8436 1726773029.68649: variable 'ansible_connection' from source: unknown 8436 1726773029.68652: variable 'ansible_module_compression' from source: unknown 8436 1726773029.68655: variable 'ansible_shell_type' from source: unknown 8436 1726773029.68659: variable 'ansible_shell_executable' from source: unknown 8436 1726773029.68662: variable 'ansible_host' from source: host vars for 'managed_node2' 8436 1726773029.68666: variable 'ansible_pipelining' from source: unknown 8436 1726773029.68669: variable 'ansible_timeout' from source: unknown 8436 1726773029.68673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8436 1726773029.68814: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8436 1726773029.68824: variable 'omit' from source: magic vars 8436 1726773029.68828: starting attempt loop 8436 1726773029.68830: running the handler 8436 1726773029.68839: _low_level_execute_command(): starting 8436 1726773029.68844: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8436 1726773029.71235: stdout chunk (state=2): >>>/root <<< 8436 1726773029.71357: stderr 
chunk (state=3): >>><<< 8436 1726773029.71367: stdout chunk (state=3): >>><<< 8436 1726773029.71389: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8436 1726773029.71404: _low_level_execute_command(): starting 8436 1726773029.71412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895 `" && echo ansible-tmp-1726773029.7139935-8436-173204940093895="` echo /root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895 `" ) && sleep 0' 8436 1726773029.73922: stdout chunk (state=2): >>>ansible-tmp-1726773029.7139935-8436-173204940093895=/root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895 <<< 8436 1726773029.74051: stderr chunk (state=3): >>><<< 8436 1726773029.74058: stdout chunk (state=3): >>><<< 8436 1726773029.74076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773029.7139935-8436-173204940093895=/root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895 , stderr= 8436 1726773029.74117: variable 'ansible_module_compression' from source: unknown 8436 1726773029.74167: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8436 1726773029.74199: variable 'ansible_facts' from source: unknown 8436 1726773029.74268: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895/AnsiballZ_stat.py 8436 1726773029.74374: Sending initial data 8436 1726773029.74381: Sent initial data (151 bytes) 8436 1726773029.76872: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpztqzeg0d /root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895/AnsiballZ_stat.py <<< 8436 1726773029.77980: stderr chunk (state=3): >>><<< 8436 1726773029.77990: stdout chunk (state=3): >>><<< 8436 1726773029.78011: done transferring module to remote 8436 1726773029.78022: _low_level_execute_command(): starting 8436 1726773029.78027: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895/ /root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895/AnsiballZ_stat.py && sleep 0' 8436 1726773029.80793: stderr chunk (state=2): >>><<< 8436 1726773029.80804: stdout chunk (state=2): >>><<< 8436 1726773029.80821: _low_level_execute_command() done: rc=0, stdout=, stderr= 8436 1726773029.80826: _low_level_execute_command(): starting 8436 1726773029.80832: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895/AnsiballZ_stat.py && sleep 0' 8436 1726773029.95998: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8436 1726773029.97093: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 8436 1726773029.97107: stdout chunk (state=3): >>><<< 8436 1726773029.97120: stderr chunk (state=3): >>><<< 8436 1726773029.97133: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 8436 1726773029.97181: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8436 1726773029.97194: _low_level_execute_command(): starting 8436 1726773029.97200: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773029.7139935-8436-173204940093895/ > /dev/null 2>&1 && sleep 0' 8436 1726773029.99877: stderr chunk (state=2): >>><<< 8436 1726773029.99891: stdout chunk (state=2): >>><<< 8436 1726773029.99909: _low_level_execute_command() done: rc=0, stdout=, stderr= 8436 1726773029.99919: handler run complete 8436 1726773029.99941: attempt loop complete, returning result 8436 1726773029.99946: _execute() done 8436 1726773029.99949: dumping result to json 8436 1726773029.99953: done dumping result, returning 8436 1726773029.99960: done running TaskExecutor() for managed_node2/TASK: See if tuned has a profile subdir [0affffe7-6841-885f-bbcf-00000000000a] 8436 1726773029.99968: sending task result for task 0affffe7-6841-885f-bbcf-00000000000a 8436 1726773030.00005: done sending task result for task 0affffe7-6841-885f-bbcf-00000000000a 8436 1726773030.00009: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 8240 1726773030.00372: no more pending results, returning what we have 8240 1726773030.00375: results queue empty 8240 1726773030.00376: checking for any_errors_fatal 8240 1726773030.00382: done checking for any_errors_fatal 8240 1726773030.00383: checking for max_fail_percentage 8240 1726773030.00384: done checking for max_fail_percentage 8240 1726773030.00387: checking to see if all hosts have failed and the running result is not ok 8240 1726773030.00387: done checking to see if all hosts have failed 8240 1726773030.00388: getting the remaining hosts for this loop 8240 1726773030.00389: done getting the remaining hosts for this loop 8240 1726773030.00392: getting the next task for host managed_node2 8240 1726773030.00397: done getting next task for host managed_node2 8240 1726773030.00399: ^ task is: TASK: Set profile dir 8240 1726773030.00400: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773030.00403: getting variables 8240 1726773030.00404: in VariableManager get_vars() 8240 1726773030.00430: Calling all_inventory to load vars for managed_node2 8240 1726773030.00433: Calling groups_inventory to load vars for managed_node2 8240 1726773030.00435: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773030.00445: Calling all_plugins_play to load vars for managed_node2 8240 1726773030.00448: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773030.00451: Calling groups_plugins_play to load vars for managed_node2 8240 1726773030.00654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773030.00842: done with get_vars() 8240 1726773030.00851: done getting variables 8240 1726773030.00910: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set profile dir] ********************************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:33 Thursday 19 September 2024 15:10:30 -0400 (0:00:00.332) 0:00:08.653 **** 8240 1726773030.00936: entering _queue_task() for managed_node2/set_fact 8240 1726773030.01127: worker is 1 (out of 1 available) 8240 1726773030.01139: exiting _queue_task() for managed_node2/set_fact 8240 1726773030.01152: done queuing things up, now waiting for results queue to drain 8240 1726773030.01153: waiting for pending results... 8449 1726773030.01363: running TaskExecutor() for managed_node2/TASK: Set profile dir 8449 1726773030.01469: in run() - task 0affffe7-6841-885f-bbcf-00000000000b 8449 1726773030.01488: variable 'ansible_search_path' from source: unknown 8449 1726773030.01517: calling self._execute() 8449 1726773030.01579: variable 'ansible_host' from source: host vars for 'managed_node2' 8449 1726773030.01590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8449 1726773030.01599: variable 'omit' from source: magic vars 8449 1726773030.01692: variable 'omit' from source: magic vars 8449 1726773030.01724: variable 'omit' from source: magic vars 8449 1726773030.02039: variable '__dir' from source: task vars 8449 1726773030.02159: variable '__tuned_profiles' from source: set_fact 8449 1726773030.02203: variable 'omit' from source: magic vars 8449 1726773030.02243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8449 1726773030.02278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8449 1726773030.02355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8449 1726773030.02374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8449 1726773030.02387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8449 1726773030.02417: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8449 1726773030.02422: variable 'ansible_host' from source: host vars for 'managed_node2' 8449 1726773030.02426: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 8449 1726773030.02515: Set connection var ansible_pipelining to False 8449 1726773030.02523: Set connection var ansible_timeout to 10 8449 1726773030.02529: Set connection var ansible_module_compression to ZIP_DEFLATED 8449 1726773030.02532: Set connection var ansible_shell_type to sh 8449 1726773030.02536: Set connection var ansible_shell_executable to /bin/sh 8449 1726773030.02540: Set connection var ansible_connection to ssh 8449 1726773030.02558: variable 'ansible_shell_executable' from source: unknown 8449 1726773030.02562: variable 'ansible_connection' from source: unknown 8449 1726773030.02567: variable 'ansible_module_compression' from source: unknown 8449 1726773030.02570: variable 'ansible_shell_type' from source: unknown 8449 1726773030.02573: variable 'ansible_shell_executable' from source: unknown 8449 1726773030.02575: variable 'ansible_host' from source: host vars for 'managed_node2' 8449 1726773030.02578: variable 'ansible_pipelining' from source: unknown 8449 1726773030.02581: variable 'ansible_timeout' from source: unknown 8449 1726773030.02584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8449 1726773030.02702: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8449 1726773030.02713: variable 'omit' from source: magic vars 8449 1726773030.02719: starting attempt loop 8449 1726773030.02722: running the handler 8449 1726773030.02733: handler run complete 8449 1726773030.02743: attempt loop complete, returning result 8449 1726773030.02746: _execute() done 8449 1726773030.02749: dumping result to json 8449 1726773030.02752: done dumping result, returning 8449 1726773030.02756: done running TaskExecutor() for managed_node2/TASK: Set profile dir [0affffe7-6841-885f-bbcf-00000000000b] 8449 1726773030.02762: sending task result for task 0affffe7-6841-885f-bbcf-00000000000b 8449 1726773030.02790: done sending task result for task 0affffe7-6841-885f-bbcf-00000000000b 8449 1726773030.02793: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__profile_dir": "/etc/tuned/kernel_settings" }, "changed": false } 8240 1726773030.03146: no more pending results, returning what we have 8240 1726773030.03149: results queue empty 8240 1726773030.03149: checking for any_errors_fatal 8240 1726773030.03155: done checking for any_errors_fatal 8240 1726773030.03156: checking for max_fail_percentage 8240 1726773030.03157: done checking for max_fail_percentage 8240 1726773030.03158: checking to see if all hosts have failed and the running result is not ok 8240 1726773030.03159: done checking to see if all hosts have failed 8240 1726773030.03159: getting the remaining hosts for this loop 8240 1726773030.03161: done getting the remaining hosts for this loop 8240 1726773030.03166: getting the next task for host managed_node2 8240 1726773030.03170: done getting next task for host managed_node2 8240 1726773030.03173: ^ task is: TASK: Ensure kernel settings profile directory exists 8240 1726773030.03174: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8240 1726773030.03177: getting variables 8240 1726773030.03178: in VariableManager get_vars() 8240 1726773030.03202: Calling all_inventory to load vars for managed_node2 8240 1726773030.03205: Calling groups_inventory to load vars for managed_node2 8240 1726773030.03208: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773030.03216: Calling all_plugins_play to load vars for managed_node2 8240 1726773030.03219: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773030.03221: Calling groups_plugins_play to load vars for managed_node2 8240 1726773030.03387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773030.03571: done with get_vars() 8240 1726773030.03581: done getting variables TASK [Ensure kernel settings profile directory exists] ************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:39 Thursday 19 September 2024 15:10:30 -0400 (0:00:00.027) 0:00:08.680 **** 8240 1726773030.03661: entering _queue_task() for managed_node2/file 8240 1726773030.03851: worker is 1 (out of 1 available) 8240 1726773030.03867: exiting _queue_task() for managed_node2/file 8240 1726773030.03879: done queuing things up, now waiting for results queue to drain 8240 1726773030.03881: waiting for pending results... 8450 1726773030.04367: running TaskExecutor() for managed_node2/TASK: Ensure kernel settings profile directory exists 8450 1726773030.04474: in run() - task 0affffe7-6841-885f-bbcf-00000000000c 8450 1726773030.04493: variable 'ansible_search_path' from source: unknown 8450 1726773030.04526: calling self._execute() 8450 1726773030.04595: variable 'ansible_host' from source: host vars for 'managed_node2' 8450 1726773030.04605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8450 1726773030.04613: variable 'omit' from source: magic vars 8450 1726773030.04711: variable 'omit' from source: magic vars 8450 1726773030.04744: variable 'omit' from source: magic vars 8450 1726773030.04775: variable '__profile_dir' from source: set_fact 8450 1726773030.05104: variable '__profile_dir' from source: set_fact 8450 1726773030.05134: variable 'omit' from source: magic vars 8450 1726773030.05174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8450 1726773030.05209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8450 1726773030.05229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8450 1726773030.05247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8450 1726773030.05259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8450 1726773030.05353: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8450 1726773030.05360: variable 'ansible_host' from source: host vars for 'managed_node2' 8450 1726773030.05367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8450 1726773030.05470: Set connection var ansible_pipelining to False 8450 1726773030.05479: Set connection var ansible_timeout to 10 8450 1726773030.05489: Set connection var ansible_module_compression to ZIP_DEFLATED 8450 1726773030.05493: 
Set connection var ansible_shell_type to sh 8450 1726773030.05499: Set connection var ansible_shell_executable to /bin/sh 8450 1726773030.05503: Set connection var ansible_connection to ssh 8450 1726773030.05522: variable 'ansible_shell_executable' from source: unknown 8450 1726773030.05526: variable 'ansible_connection' from source: unknown 8450 1726773030.05530: variable 'ansible_module_compression' from source: unknown 8450 1726773030.05532: variable 'ansible_shell_type' from source: unknown 8450 1726773030.05536: variable 'ansible_shell_executable' from source: unknown 8450 1726773030.05538: variable 'ansible_host' from source: host vars for 'managed_node2' 8450 1726773030.05542: variable 'ansible_pipelining' from source: unknown 8450 1726773030.05544: variable 'ansible_timeout' from source: unknown 8450 1726773030.05548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8450 1726773030.05727: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8450 1726773030.05739: variable 'omit' from source: magic vars 8450 1726773030.05745: starting attempt loop 8450 1726773030.05749: running the handler 8450 1726773030.05761: _low_level_execute_command(): starting 8450 1726773030.05771: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8450 1726773030.08900: stdout chunk (state=2): >>>/root <<< 8450 1726773030.08915: stderr chunk (state=2): >>><<< 8450 1726773030.08929: stdout chunk (state=3): >>><<< 8450 1726773030.08947: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8450 1726773030.08963: _low_level_execute_command(): starting 8450 1726773030.08973: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175 `" && echo ansible-tmp-1726773030.0895622-8450-208103307337175="` echo /root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175 `" ) && sleep 0' 8450 1726773030.12101: stdout chunk (state=2): >>>ansible-tmp-1726773030.0895622-8450-208103307337175=/root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175 <<< 8450 1726773030.12259: stderr chunk (state=3): >>><<< 8450 1726773030.12269: stdout chunk (state=3): >>><<< 8450 1726773030.12291: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773030.0895622-8450-208103307337175=/root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175 , stderr= 8450 1726773030.12334: variable 'ansible_module_compression' from source: unknown 8450 1726773030.12394: ANSIBALLZ: Using lock for file 8450 1726773030.12400: ANSIBALLZ: Acquiring lock 8450 1726773030.12403: ANSIBALLZ: Lock acquired: 139787572728400 8450 1726773030.12407: ANSIBALLZ: Creating module 8450 1726773030.25101: ANSIBALLZ: Writing module into payload 8450 1726773030.25289: ANSIBALLZ: Writing module 8450 1726773030.25311: ANSIBALLZ: Renaming module 8450 1726773030.25322: ANSIBALLZ: Done creating module 8450 1726773030.25339: variable 'ansible_facts' from source: unknown 8450 1726773030.25419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175/AnsiballZ_file.py 8450 1726773030.25573: Sending initial data 8450 1726773030.25580: Sent initial data (151 bytes) 8450 1726773030.28373: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpfvwtdz1h /root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175/AnsiballZ_file.py <<< 8450 1726773030.29537: stderr chunk (state=3): >>><<< 8450 1726773030.29547: stdout chunk (state=3): >>><<< 8450 1726773030.29571: done transferring module to remote 8450 1726773030.29583: _low_level_execute_command(): starting 8450 1726773030.29590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175/ /root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175/AnsiballZ_file.py && sleep 0' 8450 1726773030.31956: stderr chunk (state=2): >>><<< 8450 1726773030.31968: stdout chunk (state=2): >>><<< 8450 1726773030.31983: _low_level_execute_command() done: rc=0, stdout=, stderr= 8450 1726773030.31990: _low_level_execute_command(): starting 8450 1726773030.31996: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175/AnsiballZ_file.py && sleep 0' 8450 1726773030.47901: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8450 1726773030.49032: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8450 1726773030.49086: stderr chunk (state=3): >>><<< 8450 1726773030.49094: stdout chunk (state=3): >>><<< 8450 1726773030.49113: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
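For reference, the two tasks executed in this stretch of the log, "Set profile dir" and "Ensure kernel settings profile directory exists", can be reconstructed approximately from the facts and module arguments visible above. This is a sketch only: the log shows the variables __dir and __tuned_profiles being resolved and the resulting value /etc/tuned/kernel_settings, but not the exact Jinja expression, so that expression is an assumption.

# Sketch based on the logged facts and module args; the set_fact expression is assumed.
- name: Set profile dir
  set_fact:
    __profile_dir: "{{ __tuned_profiles }}/{{ __dir }}"   # log shows the result: /etc/tuned/kernel_settings

- name: Ensure kernel settings profile directory exists
  file:
    path: "{{ __profile_dir }}"   # /etc/tuned/kernel_settings, per the file module result above
    state: directory
    mode: "0755"
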
8450 1726773030.49147: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8450 1726773030.49157: _low_level_execute_command(): starting 8450 1726773030.49163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773030.0895622-8450-208103307337175/ > /dev/null 2>&1 && sleep 0' 8450 1726773030.51581: stderr chunk (state=2): >>><<< 8450 1726773030.51593: stdout chunk (state=2): >>><<< 8450 1726773030.51608: _low_level_execute_command() done: rc=0, stdout=, stderr= 8450 1726773030.51615: handler run complete 8450 1726773030.51632: attempt loop complete, returning result 8450 1726773030.51636: _execute() done 8450 1726773030.51638: dumping result to json 8450 1726773030.51644: done dumping result, returning 8450 1726773030.51651: done running TaskExecutor() for managed_node2/TASK: Ensure kernel settings profile directory exists [0affffe7-6841-885f-bbcf-00000000000c] 8450 1726773030.51656: sending task result for task 0affffe7-6841-885f-bbcf-00000000000c 8450 1726773030.51696: done sending task result for task 0affffe7-6841-885f-bbcf-00000000000c 8450 1726773030.51700: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "state": "directory", "uid": 0 } 8240 1726773030.51848: no more pending results, returning what we have 8240 1726773030.51851: results queue empty 8240 1726773030.51852: checking for any_errors_fatal 8240 1726773030.51856: done checking for any_errors_fatal 8240 1726773030.51857: checking for max_fail_percentage 8240 1726773030.51858: done checking for max_fail_percentage 8240 1726773030.51859: checking to see if all hosts have failed and the running result is not ok 8240 1726773030.51859: done checking to see if all hosts have failed 8240 1726773030.51860: getting the remaining hosts for this loop 8240 1726773030.51861: done getting the remaining hosts for this loop 8240 1726773030.51864: getting the next task for host managed_node2 8240 1726773030.51868: done getting next task for host managed_node2 8240 1726773030.51870: ^ task is: TASK: Generate a configuration for kernel settings 8240 1726773030.51872: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773030.51874: getting variables 8240 1726773030.51876: in VariableManager get_vars() 8240 1726773030.51904: Calling all_inventory to load vars for managed_node2 8240 1726773030.51906: Calling groups_inventory to load vars for managed_node2 8240 1726773030.51909: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773030.51919: Calling all_plugins_play to load vars for managed_node2 8240 1726773030.51921: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773030.51924: Calling groups_plugins_play to load vars for managed_node2 8240 1726773030.52069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773030.52180: done with get_vars() 8240 1726773030.52189: done getting variables 8240 1726773030.52277: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Generate a configuration for kernel settings] **************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:45 Thursday 19 September 2024 15:10:30 -0400 (0:00:00.486) 0:00:09.166 **** 8240 1726773030.52299: entering _queue_task() for managed_node2/copy 8240 1726773030.52442: worker is 1 (out of 1 available) 8240 1726773030.52452: exiting _queue_task() for managed_node2/copy 8240 1726773030.52463: done queuing things up, now waiting for results queue to drain 8240 1726773030.52464: waiting for pending results... 8469 1726773030.52615: running TaskExecutor() for managed_node2/TASK: Generate a configuration for kernel settings 8469 1726773030.52705: in run() - task 0affffe7-6841-885f-bbcf-00000000000d 8469 1726773030.52721: variable 'ansible_search_path' from source: unknown 8469 1726773030.52751: calling self._execute() 8469 1726773030.52810: variable 'ansible_host' from source: host vars for 'managed_node2' 8469 1726773030.52818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8469 1726773030.52827: variable 'omit' from source: magic vars 8469 1726773030.52902: variable 'omit' from source: magic vars 8469 1726773030.52928: variable 'omit' from source: magic vars 8469 1726773030.53154: variable '__profile_dir' from source: set_fact 8469 1726773030.53182: variable 'omit' from source: magic vars 8469 1726773030.53214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8469 1726773030.53238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8469 1726773030.53253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8469 1726773030.53265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8469 1726773030.53274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8469 1726773030.53299: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8469 1726773030.53303: variable 'ansible_host' from source: host vars for 'managed_node2' 8469 1726773030.53306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8469 
1726773030.53371: Set connection var ansible_pipelining to False 8469 1726773030.53376: Set connection var ansible_timeout to 10 8469 1726773030.53381: Set connection var ansible_module_compression to ZIP_DEFLATED 8469 1726773030.53383: Set connection var ansible_shell_type to sh 8469 1726773030.53389: Set connection var ansible_shell_executable to /bin/sh 8469 1726773030.53392: Set connection var ansible_connection to ssh 8469 1726773030.53406: variable 'ansible_shell_executable' from source: unknown 8469 1726773030.53409: variable 'ansible_connection' from source: unknown 8469 1726773030.53411: variable 'ansible_module_compression' from source: unknown 8469 1726773030.53413: variable 'ansible_shell_type' from source: unknown 8469 1726773030.53414: variable 'ansible_shell_executable' from source: unknown 8469 1726773030.53416: variable 'ansible_host' from source: host vars for 'managed_node2' 8469 1726773030.53418: variable 'ansible_pipelining' from source: unknown 8469 1726773030.53419: variable 'ansible_timeout' from source: unknown 8469 1726773030.53422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8469 1726773030.53539: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8469 1726773030.53550: variable 'omit' from source: magic vars 8469 1726773030.53557: starting attempt loop 8469 1726773030.53560: running the handler 8469 1726773030.53572: _low_level_execute_command(): starting 8469 1726773030.53580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8469 1726773030.55881: stdout chunk (state=2): >>>/root <<< 8469 1726773030.56004: stderr chunk (state=3): >>><<< 8469 1726773030.56012: stdout chunk (state=3): >>><<< 8469 1726773030.56031: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8469 1726773030.56045: _low_level_execute_command(): starting 8469 1726773030.56051: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732 `" && echo ansible-tmp-1726773030.5604022-8469-7540192941732="` echo /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732 `" ) && sleep 0' 8469 1726773030.58544: stdout chunk (state=2): >>>ansible-tmp-1726773030.5604022-8469-7540192941732=/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732 <<< 8469 1726773030.58669: stderr chunk (state=3): >>><<< 8469 1726773030.58676: stdout chunk (state=3): >>><<< 8469 1726773030.58694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773030.5604022-8469-7540192941732=/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732 , stderr= 8469 1726773030.58706: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings 8469 1726773030.58725: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/tuned/etc/tuned/change_settings/tuned.conf /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/files/tuned/etc/tuned/change_settings/tuned.conf 
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf 8469 1726773030.58780: variable 'ansible_module_compression' from source: unknown 8469 1726773030.58827: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8469 1726773030.58856: variable 'ansible_facts' from source: unknown 8469 1726773030.58928: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/AnsiballZ_stat.py 8469 1726773030.59031: Sending initial data 8469 1726773030.59038: Sent initial data (149 bytes) 8469 1726773030.61928: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp48f_g2h4 /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/AnsiballZ_stat.py <<< 8469 1726773030.63393: stderr chunk (state=3): >>><<< 8469 1726773030.63404: stdout chunk (state=3): >>><<< 8469 1726773030.63428: done transferring module to remote 8469 1726773030.63441: _low_level_execute_command(): starting 8469 1726773030.63447: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/ /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/AnsiballZ_stat.py && sleep 0' 8469 1726773030.66172: stderr chunk (state=2): >>><<< 8469 1726773030.66184: stdout chunk (state=2): >>><<< 8469 1726773030.66204: _low_level_execute_command() done: rc=0, stdout=, stderr= 8469 1726773030.66209: _low_level_execute_command(): starting 8469 1726773030.66215: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/AnsiballZ_stat.py && sleep 0' 8469 1726773030.81030: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8469 1726773030.82043: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8469 1726773030.82093: stderr chunk (state=3): >>><<< 8469 1726773030.82102: stdout chunk (state=3): >>><<< 8469 1726773030.82118: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
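As the surrounding messages show, the copy action plugin first stats the destination on the remote host before deciding whether to transfer the source file. An explicit task performing the same check, built only from the module arguments shown in the stat result above, would look roughly like this (the register name is illustrative, not taken from the log):

- name: Stat the destination the way the copy action does internally
  ansible.builtin.stat:
    path: /etc/tuned/kernel_settings/tuned.conf
    follow: false
    get_checksum: true
    checksum_algorithm: sha1
    get_mime: true
    get_attributes: true
  register: __dest_stat   # illustrative name
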
8469 1726773030.82146: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8469 1726773030.82233: Sending initial data 8469 1726773030.82240: Sent initial data (212 bytes) 8469 1726773030.84860: stdout chunk (state=3): >>>sftp> put /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tuned/etc/tuned/change_settings/tuned.conf /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/source <<< 8469 1726773030.85242: stderr chunk (state=3): >>><<< 8469 1726773030.85249: stdout chunk (state=3): >>><<< 8469 1726773030.85267: _low_level_execute_command(): starting 8469 1726773030.85274: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/ /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/source && sleep 0' 8469 1726773030.87609: stderr chunk (state=2): >>><<< 8469 1726773030.87618: stdout chunk (state=2): >>><<< 8469 1726773030.87632: _low_level_execute_command() done: rc=0, stdout=, stderr= 8469 1726773030.87655: variable 'ansible_module_compression' from source: unknown 8469 1726773030.87695: ANSIBALLZ: Using generic lock for ansible.legacy.copy 8469 1726773030.87700: ANSIBALLZ: Acquiring lock 8469 1726773030.87703: ANSIBALLZ: Lock acquired: 139787572477392 8469 1726773030.87707: ANSIBALLZ: Creating module 8469 1726773030.99945: ANSIBALLZ: Writing module into payload 8469 1726773031.00138: ANSIBALLZ: Writing module 8469 1726773031.00167: ANSIBALLZ: Renaming module 8469 1726773031.00176: ANSIBALLZ: Done creating module 8469 1726773031.00193: variable 'ansible_facts' from source: unknown 8469 1726773031.00274: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/AnsiballZ_copy.py 8469 1726773031.00752: Sending initial data 8469 1726773031.00759: Sent initial data (149 bytes) 8469 1726773031.03693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpaezwhfhl /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/AnsiballZ_copy.py <<< 8469 1726773031.06091: stderr chunk (state=3): >>><<< 8469 1726773031.06100: stdout chunk (state=3): >>><<< 8469 1726773031.06120: done transferring module to remote 8469 1726773031.06130: _low_level_execute_command(): starting 8469 1726773031.06136: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/ /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/AnsiballZ_copy.py && sleep 0' 8469 1726773031.11216: stderr chunk (state=2): >>><<< 8469 1726773031.11227: stdout chunk (state=2): >>><<< 8469 1726773031.11243: _low_level_execute_command() done: rc=0, stdout=, stderr= 8469 1726773031.11247: _low_level_execute_command(): 
starting 8469 1726773031.11252: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/AnsiballZ_copy.py && sleep 0' 8469 1726773031.28404: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/source", "md5sum": "d5df32baf1a63528844555117ead6672", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "_original_basename": "tuned.conf", "follow": false, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8469 1726773031.29654: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8469 1726773031.29667: stdout chunk (state=3): >>><<< 8469 1726773031.29679: stderr chunk (state=3): >>><<< 8469 1726773031.29694: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/source", "md5sum": "d5df32baf1a63528844555117ead6672", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "_original_basename": "tuned.conf", "follow": false, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
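Based on the module arguments in the copy result above and the sftp source path logged earlier, the "Generate a configuration for kernel settings" task amounts to copying the test's tuned.conf into the new profile directory. A minimal sketch; how tests_change_settings.yml actually references the source may differ:

- name: Generate a configuration for kernel settings
  copy:
    src: tuned/etc/tuned/change_settings/tuned.conf   # resolved under the test directory, per the logged search_path
    dest: "{{ __profile_dir }}/tuned.conf"            # /etc/tuned/kernel_settings/tuned.conf
    mode: "0644"
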
8469 1726773031.29728: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', '_original_basename': 'tuned.conf', 'follow': False, 'checksum': '13fdc203370e2b8e7e42c13d94b671b1ac621563', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8469 1726773031.29738: _low_level_execute_command(): starting 8469 1726773031.29743: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/ > /dev/null 2>&1 && sleep 0' 8469 1726773031.32415: stderr chunk (state=2): >>><<< 8469 1726773031.32424: stdout chunk (state=2): >>><<< 8469 1726773031.32437: _low_level_execute_command() done: rc=0, stdout=, stderr= 8469 1726773031.32443: handler run complete 8469 1726773031.32460: attempt loop complete, returning result 8469 1726773031.32463: _execute() done 8469 1726773031.32465: dumping result to json 8469 1726773031.32470: done dumping result, returning 8469 1726773031.32475: done running TaskExecutor() for managed_node2/TASK: Generate a configuration for kernel settings [0affffe7-6841-885f-bbcf-00000000000d] 8469 1726773031.32479: sending task result for task 0affffe7-6841-885f-bbcf-00000000000d 8469 1726773031.32506: done sending task result for task 0affffe7-6841-885f-bbcf-00000000000d 8469 1726773031.32508: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "d5df32baf1a63528844555117ead6672", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 381, "src": "/root/.ansible/tmp/ansible-tmp-1726773030.5604022-8469-7540192941732/source", "state": "file", "uid": 0 } 8240 1726773031.32749: no more pending results, returning what we have 8240 1726773031.32752: results queue empty 8240 1726773031.32753: checking for any_errors_fatal 8240 1726773031.32759: done checking for any_errors_fatal 8240 1726773031.32760: checking for max_fail_percentage 8240 1726773031.32761: done checking for max_fail_percentage 8240 1726773031.32761: checking to see if all hosts have failed and the running result is not ok 8240 1726773031.32762: done checking to see if all hosts have failed 8240 1726773031.32762: getting the remaining hosts for this loop 8240 1726773031.32763: done getting the remaining hosts for this loop 8240 1726773031.32768: getting the next task for host managed_node2 8240 1726773031.32771: done getting next task for host managed_node2 8240 1726773031.32773: ^ task is: TASK: Ensure required services are enabled and started 8240 1726773031.32774: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 8240 1726773031.32776: getting variables 8240 1726773031.32777: in VariableManager get_vars() 8240 1726773031.32804: Calling all_inventory to load vars for managed_node2 8240 1726773031.32806: Calling groups_inventory to load vars for managed_node2 8240 1726773031.32808: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773031.32815: Calling all_plugins_play to load vars for managed_node2 8240 1726773031.32817: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773031.32819: Calling groups_plugins_play to load vars for managed_node2 8240 1726773031.32956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773031.33127: done with get_vars() 8240 1726773031.33140: done getting variables 8240 1726773031.33244: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure required services are enabled and started] ************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:51 Thursday 19 September 2024 15:10:31 -0400 (0:00:00.809) 0:00:09.976 **** 8240 1726773031.33275: entering _queue_task() for managed_node2/service 8240 1726773031.33276: Creating lock for service 8240 1726773031.33514: worker is 1 (out of 1 available) 8240 1726773031.33529: exiting _queue_task() for managed_node2/service 8240 1726773031.33545: done queuing things up, now waiting for results queue to drain 8240 1726773031.33547: waiting for pending results... 
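The task queued here resolves, per the ansible.legacy.systemd invocation shown in the result that follows, to enabling and restarting tuned. A minimal equivalent task sketch:

- name: Ensure required services are enabled and started
  service:
    name: tuned
    state: restarted
    enabled: true
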
8515 1726773031.33780: running TaskExecutor() for managed_node2/TASK: Ensure required services are enabled and started 8515 1726773031.33883: in run() - task 0affffe7-6841-885f-bbcf-00000000000e 8515 1726773031.33901: variable 'ansible_search_path' from source: unknown 8515 1726773031.33929: calling self._execute() 8515 1726773031.33984: variable 'ansible_host' from source: host vars for 'managed_node2' 8515 1726773031.33993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8515 1726773031.34001: variable 'omit' from source: magic vars 8515 1726773031.34073: variable 'omit' from source: magic vars 8515 1726773031.34102: variable 'omit' from source: magic vars 8515 1726773031.34139: variable 'omit' from source: magic vars 8515 1726773031.34174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8515 1726773031.34202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8515 1726773031.34228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8515 1726773031.34243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8515 1726773031.34254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8515 1726773031.34278: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8515 1726773031.34282: variable 'ansible_host' from source: host vars for 'managed_node2' 8515 1726773031.34286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8515 1726773031.34373: Set connection var ansible_pipelining to False 8515 1726773031.34382: Set connection var ansible_timeout to 10 8515 1726773031.34391: Set connection var ansible_module_compression to ZIP_DEFLATED 8515 1726773031.34395: Set connection var ansible_shell_type to sh 8515 1726773031.34400: Set connection var ansible_shell_executable to /bin/sh 8515 1726773031.34404: Set connection var ansible_connection to ssh 8515 1726773031.34423: variable 'ansible_shell_executable' from source: unknown 8515 1726773031.34427: variable 'ansible_connection' from source: unknown 8515 1726773031.34430: variable 'ansible_module_compression' from source: unknown 8515 1726773031.34433: variable 'ansible_shell_type' from source: unknown 8515 1726773031.34435: variable 'ansible_shell_executable' from source: unknown 8515 1726773031.34438: variable 'ansible_host' from source: host vars for 'managed_node2' 8515 1726773031.34441: variable 'ansible_pipelining' from source: unknown 8515 1726773031.34444: variable 'ansible_timeout' from source: unknown 8515 1726773031.34447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8515 1726773031.34574: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8515 1726773031.34592: variable 'omit' from source: magic vars 8515 1726773031.34600: starting attempt loop 8515 1726773031.34602: running the handler 8515 1726773031.34926: variable 'ansible_facts' from source: unknown 8515 1726773031.35060: _low_level_execute_command(): starting 8515 1726773031.35074: _low_level_execute_command(): executing: 
/bin/sh -c 'echo ~ && sleep 0' 8515 1726773031.38331: stdout chunk (state=2): >>>/root <<< 8515 1726773031.38457: stderr chunk (state=3): >>><<< 8515 1726773031.38466: stdout chunk (state=3): >>><<< 8515 1726773031.38488: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8515 1726773031.38502: _low_level_execute_command(): starting 8515 1726773031.38509: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883 `" && echo ansible-tmp-1726773031.3849745-8515-110564091113883="` echo /root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883 `" ) && sleep 0' 8515 1726773031.41085: stdout chunk (state=2): >>>ansible-tmp-1726773031.3849745-8515-110564091113883=/root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883 <<< 8515 1726773031.41224: stderr chunk (state=3): >>><<< 8515 1726773031.41232: stdout chunk (state=3): >>><<< 8515 1726773031.41248: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773031.3849745-8515-110564091113883=/root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883 , stderr= 8515 1726773031.41274: variable 'ansible_module_compression' from source: unknown 8515 1726773031.41325: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 8515 1726773031.41330: ANSIBALLZ: Acquiring lock 8515 1726773031.41333: ANSIBALLZ: Lock acquired: 139787572477392 8515 1726773031.41337: ANSIBALLZ: Creating module 8515 1726773031.68246: ANSIBALLZ: Writing module into payload 8515 1726773031.68467: ANSIBALLZ: Writing module 8515 1726773031.68501: ANSIBALLZ: Renaming module 8515 1726773031.68509: ANSIBALLZ: Done creating module 8515 1726773031.68529: variable 'ansible_facts' from source: unknown 8515 1726773031.68753: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883/AnsiballZ_systemd.py 8515 1726773031.69222: Sending initial data 8515 1726773031.69229: Sent initial data (154 bytes) 8515 1726773031.71759: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpf8uft8fq /root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883/AnsiballZ_systemd.py <<< 8515 1726773031.74156: stderr chunk (state=3): >>><<< 8515 1726773031.74167: stdout chunk (state=3): >>><<< 8515 1726773031.74194: done transferring module to remote 8515 1726773031.74206: _low_level_execute_command(): starting 8515 1726773031.74212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883/ /root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883/AnsiballZ_systemd.py && sleep 0' 8515 1726773031.76815: stderr chunk (state=2): >>><<< 8515 1726773031.76826: stdout chunk (state=2): >>><<< 8515 1726773031.76841: _low_level_execute_command() done: rc=0, stdout=, stderr= 8515 1726773031.76848: _low_level_execute_command(): starting 8515 1726773031.76854: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883/AnsiballZ_systemd.py && sleep 0' 8515 1726773032.41889: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": 
"0", "WatchdogTimestamp": "Thu 2024-09-19 15:03:45 EDT", "WatchdogTimestampMonotonic": "21263547", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "686", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ExecMainStartTimestampMonotonic": "20363959", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "686", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:44 EDT] ; stop_time=[n/a] ; pid=686 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18628608", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override 
cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.service dbus.socket system.slice sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "Before": "multi-user.target shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:03:45 EDT", "StateChangeTimestampMonotonic": "21263550", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:44 EDT", "InactiveExitTimestampMonotonic": "20364018", "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:45 EDT", "ActiveEnterTimestampMonotonic": "21263550", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ConditionTimestampMonotonic": "20362941", "AssertTimestamp": "Thu 2024-09-19 15:03:44 EDT", "AssertTimestampMonotonic": "20362942", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "7c00372bd2eb4de19310fd12d82bc5f0", "CollectMode": "inactive"}, "enabled": true, "state": 
"started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8515 1726773032.43581: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8515 1726773032.43631: stderr chunk (state=3): >>><<< 8515 1726773032.43638: stdout chunk (state=3): >>><<< 8515 1726773032.43658: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:03:45 EDT", "WatchdogTimestampMonotonic": "21263547", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "686", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ExecMainStartTimestampMonotonic": "20363959", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "686", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:44 EDT] ; stop_time=[n/a] ; pid=686 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18628608", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.service dbus.socket system.slice sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "Before": "multi-user.target shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:03:45 EDT", "StateChangeTimestampMonotonic": "21263550", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:44 EDT", "InactiveExitTimestampMonotonic": "20364018", "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:45 EDT", "ActiveEnterTimestampMonotonic": "21263550", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ConditionTimestampMonotonic": "20362941", "AssertTimestamp": "Thu 2024-09-19 15:03:44 EDT", "AssertTimestampMonotonic": "20362942", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "7c00372bd2eb4de19310fd12d82bc5f0", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.9.64 closed. 8515 1726773032.43777: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8515 1726773032.43797: _low_level_execute_command(): starting 8515 1726773032.43804: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773031.3849745-8515-110564091113883/ > /dev/null 2>&1 && sleep 0' 8515 1726773032.46600: stderr chunk (state=2): >>><<< 8515 1726773032.46611: stdout chunk (state=2): >>><<< 8515 1726773032.46627: _low_level_execute_command() done: rc=0, stdout=, stderr= 8515 1726773032.46635: handler run complete 8515 1726773032.46693: attempt loop complete, returning result 8515 1726773032.46700: _execute() done 8515 1726773032.46703: dumping result to json 8515 1726773032.46717: done dumping result, returning 8515 1726773032.46723: done running TaskExecutor() for managed_node2/TASK: Ensure required services are enabled and started [0affffe7-6841-885f-bbcf-00000000000e] 8515 1726773032.46728: sending task result for task 0affffe7-6841-885f-bbcf-00000000000e 8515 1726773032.46843: done sending task result for task 0affffe7-6841-885f-bbcf-00000000000e 8515 1726773032.46847: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "enabled": true, "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:03:45 EDT", "ActiveEnterTimestampMonotonic": "21263550", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:03:44 EDT", "AssertTimestampMonotonic": "20362942", "Before": "multi-user.target shutdown.target", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ConditionTimestampMonotonic": "20362941", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "686", "ExecMainStartTimestamp": "Thu 2024-09-19 15:03:44 EDT", "ExecMainStartTimestampMonotonic": "20363959", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:03:44 EDT] ; stop_time=[n/a] ; pid=686 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:03:44 EDT", "InactiveExitTimestampMonotonic": "20364018", "InvocationID": "7c00372bd2eb4de19310fd12d82bc5f0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", 
"LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "686", "MemoryAccounting": "yes", "MemoryCurrent": "18628608", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.service dbus.socket system.slice sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:03:45 EDT", "StateChangeTimestampMonotonic": "21263550", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:03:45 EDT", "WatchdogTimestampMonotonic": "21263547", "WatchdogUSec": "0" } } 8240 1726773032.47555: no more pending results, returning what we have 8240 1726773032.47558: results queue empty 8240 1726773032.47559: checking for any_errors_fatal 8240 1726773032.47564: done checking for any_errors_fatal 8240 1726773032.47564: checking for max_fail_percentage 8240 1726773032.47566: done checking for max_fail_percentage 8240 1726773032.47567: 
checking to see if all hosts have failed and the running result is not ok 8240 1726773032.47567: done checking to see if all hosts have failed 8240 1726773032.47568: getting the remaining hosts for this loop 8240 1726773032.47569: done getting the remaining hosts for this loop 8240 1726773032.47573: getting the next task for host managed_node2 8240 1726773032.47577: done getting next task for host managed_node2 8240 1726773032.47580: ^ task is: TASK: Apply kernel_settings 8240 1726773032.47581: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773032.47584: getting variables 8240 1726773032.47587: in VariableManager get_vars() 8240 1726773032.47611: Calling all_inventory to load vars for managed_node2 8240 1726773032.47614: Calling groups_inventory to load vars for managed_node2 8240 1726773032.47617: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.47626: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.47628: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.47631: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.47787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.47977: done with get_vars() 8240 1726773032.47989: done getting variables TASK [Apply kernel_settings] *************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:57 Thursday 19 September 2024 15:10:32 -0400 (0:00:01.147) 0:00:11.124 **** 8240 1726773032.48073: entering _queue_task() for managed_node2/include_role 8240 1726773032.48075: Creating lock for include_role 8240 1726773032.48269: worker is 1 (out of 1 available) 8240 1726773032.48291: exiting _queue_task() for managed_node2/include_role 8240 1726773032.48306: done queuing things up, now waiting for results queue to drain 8240 1726773032.48308: waiting for pending results... 
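For reference, the result above comes from the ansible.legacy.systemd module called with module_args name=tuned, state=restarted, enabled=true, and the controller then queues the "Apply kernel_settings" include_role task from tests_change_settings.yml:57. A minimal sketch of tasks that would produce this trace follows; it assumes the plain systemd/include_role form, since the test playbook's actual loop structure and variables are not visible in this log:

- name: Ensure required services are enabled and started
  ansible.builtin.systemd:
    name: tuned
    state: restarted
    enabled: true
    # daemon_reload/daemon_reexec are left at their defaults (false),
    # matching the module_args recorded in the result above

- name: Apply kernel_settings
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.kernel_settings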
8571 1726773032.48478: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings 8571 1726773032.48606: in run() - task 0affffe7-6841-885f-bbcf-00000000000f 8571 1726773032.48626: variable 'ansible_search_path' from source: unknown 8571 1726773032.48661: calling self._execute() 8571 1726773032.48731: variable 'ansible_host' from source: host vars for 'managed_node2' 8571 1726773032.48741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8571 1726773032.48751: variable 'omit' from source: magic vars 8571 1726773032.48855: _execute() done 8571 1726773032.48862: dumping result to json 8571 1726773032.48870: done dumping result, returning 8571 1726773032.48875: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings [0affffe7-6841-885f-bbcf-00000000000f] 8571 1726773032.48881: sending task result for task 0affffe7-6841-885f-bbcf-00000000000f 8571 1726773032.48924: done sending task result for task 0affffe7-6841-885f-bbcf-00000000000f 8571 1726773032.48928: WORKER PROCESS EXITING 8240 1726773032.50193: no more pending results, returning what we have 8240 1726773032.50197: in VariableManager get_vars() 8240 1726773032.50225: Calling all_inventory to load vars for managed_node2 8240 1726773032.50228: Calling groups_inventory to load vars for managed_node2 8240 1726773032.50231: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.50241: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.50244: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.50247: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.50439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.50618: done with get_vars() 8240 1726773032.50626: variable 'ansible_search_path' from source: unknown 8240 1726773032.51348: variable 'omit' from source: magic vars 8240 1726773032.51365: variable 'omit' from source: magic vars 8240 1726773032.51376: variable 'omit' from source: magic vars 8240 1726773032.51378: we have included files to process 8240 1726773032.51379: generating all_blocks data 8240 1726773032.51380: done generating all_blocks data 8240 1726773032.51380: processing included file: fedora.linux_system_roles.kernel_settings 8240 1726773032.51402: in VariableManager get_vars() 8240 1726773032.51417: done with get_vars() 8240 1726773032.51490: in VariableManager get_vars() 8240 1726773032.51500: done with get_vars() 8240 1726773032.51526: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8240 1726773032.51737: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8240 1726773032.51778: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8240 1726773032.51855: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8240 1726773032.52334: in VariableManager get_vars() 8240 1726773032.52348: done with get_vars() 8240 1726773032.53217: in VariableManager get_vars() 8240 1726773032.53229: done with get_vars() 8240 1726773032.53387: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8240 1726773032.54029: iterating over new_blocks loaded from include file 8240 
1726773032.54031: in VariableManager get_vars() 8240 1726773032.54046: done with get_vars() 8240 1726773032.54047: filtering new block on tags 8240 1726773032.54088: done filtering new block on tags 8240 1726773032.54091: in VariableManager get_vars() 8240 1726773032.54105: done with get_vars() 8240 1726773032.54106: filtering new block on tags 8240 1726773032.54140: done filtering new block on tags 8240 1726773032.54142: in VariableManager get_vars() 8240 1726773032.54158: done with get_vars() 8240 1726773032.54159: filtering new block on tags 8240 1726773032.54310: done filtering new block on tags 8240 1726773032.54313: in VariableManager get_vars() 8240 1726773032.54326: done with get_vars() 8240 1726773032.54327: filtering new block on tags 8240 1726773032.54344: done filtering new block on tags 8240 1726773032.54346: done iterating over new_blocks loaded from include file 8240 1726773032.54346: extending task lists for all hosts with included blocks 8240 1726773032.55038: done extending task lists 8240 1726773032.55039: done processing included files 8240 1726773032.55040: results queue empty 8240 1726773032.55041: checking for any_errors_fatal 8240 1726773032.55049: done checking for any_errors_fatal 8240 1726773032.55049: checking for max_fail_percentage 8240 1726773032.55050: done checking for max_fail_percentage 8240 1726773032.55050: checking to see if all hosts have failed and the running result is not ok 8240 1726773032.55051: done checking to see if all hosts have failed 8240 1726773032.55052: getting the remaining hosts for this loop 8240 1726773032.55053: done getting the remaining hosts for this loop 8240 1726773032.55055: getting the next task for host managed_node2 8240 1726773032.55059: done getting next task for host managed_node2 8240 1726773032.55061: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8240 1726773032.55062: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773032.55074: getting variables 8240 1726773032.55074: in VariableManager get_vars() 8240 1726773032.55089: Calling all_inventory to load vars for managed_node2 8240 1726773032.55092: Calling groups_inventory to load vars for managed_node2 8240 1726773032.55094: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.55099: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.55101: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.55103: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.55240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.55432: done with get_vars() 8240 1726773032.55441: done getting variables 8240 1726773032.55519: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:10:32 -0400 (0:00:00.074) 0:00:11.199 **** 8240 1726773032.55547: entering _queue_task() for managed_node2/fail 8240 1726773032.55549: Creating lock for fail 8240 1726773032.55774: worker is 1 (out of 1 available) 8240 1726773032.55791: exiting _queue_task() for managed_node2/fail 8240 1726773032.55804: done queuing things up, now waiting for results queue to drain 8240 1726773032.55805: waiting for pending results... 
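At this point the role's vars, defaults, meta, tasks and handlers files have been loaded from the collection path and spliced into the host's task list, and the first role task, "Check sysctl settings for boolean values" (tasks/main.yml:2), is queued. Judging from the conditionals evaluated in the entries that follow, the task is roughly the fail guard sketched below; the failure message text is an assumption, while both when conditions are taken from the evaluated conditionals in this log:

- name: Check sysctl settings for boolean values
  ansible.builtin.fail:
    msg: sysctl settings must not use boolean values  # message text assumed
  when:
    - kernel_settings_sysctl != __kernel_settings_state_empty
    - >-
      (kernel_settings_sysctl | selectattr("value", "defined") |
       selectattr("value", "sameas", true) | list | length > 0) or
      (kernel_settings_sysctl | selectattr("value", "defined") |
       selectattr("value", "sameas", false) | list | length > 0)

In the run below, the first condition evaluates True but the selectattr checks find no boolean values, so the task is skipped.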
8576 1726773032.56017: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8576 1726773032.56151: in run() - task 0affffe7-6841-885f-bbcf-0000000000ad 8576 1726773032.56172: variable 'ansible_search_path' from source: unknown 8576 1726773032.56177: variable 'ansible_search_path' from source: unknown 8576 1726773032.56210: calling self._execute() 8576 1726773032.56280: variable 'ansible_host' from source: host vars for 'managed_node2' 8576 1726773032.56291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8576 1726773032.56299: variable 'omit' from source: magic vars 8576 1726773032.56750: variable 'kernel_settings_sysctl' from source: include params 8576 1726773032.56763: variable '__kernel_settings_state_empty' from source: role '' all vars 8576 1726773032.56771: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 8576 1726773032.57008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8576 1726773032.58751: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8576 1726773032.58827: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8576 1726773032.58862: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8576 1726773032.58898: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8576 1726773032.58923: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8576 1726773032.58996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8576 1726773032.59025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8576 1726773032.59054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8576 1726773032.59114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8576 1726773032.59127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8576 1726773032.59183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8576 1726773032.59208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8576 1726773032.59231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8576 1726773032.59269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8576 1726773032.59283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8576 1726773032.59328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8576 1726773032.59351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8576 1726773032.59373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8576 1726773032.59413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8576 1726773032.59427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8576 1726773032.59690: variable 'kernel_settings_sysctl' from source: include params 8576 1726773032.59745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8576 1726773032.59858: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8576 1726773032.59892: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8576 1726773032.59925: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8576 1726773032.59954: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8576 1726773032.59998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8576 1726773032.60020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8576 1726773032.60045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8576 1726773032.60070: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8576 1726773032.60113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8576 1726773032.60133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 8576 1726773032.60153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8576 1726773032.60176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8576 1726773032.60418: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 8576 1726773032.60424: when evaluation is False, skipping this task 8576 1726773032.60429: _execute() done 8576 1726773032.60432: dumping result to json 8576 1726773032.60439: done dumping result, returning 8576 1726773032.60444: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-885f-bbcf-0000000000ad] 8576 1726773032.60449: sending task result for task 0affffe7-6841-885f-bbcf-0000000000ad 8576 1726773032.60470: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000ad 8576 1726773032.60472: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8240 1726773032.60939: no more pending results, returning what we have 8240 1726773032.60942: results queue empty 8240 1726773032.60943: checking for any_errors_fatal 8240 1726773032.60945: done checking for any_errors_fatal 8240 1726773032.60946: checking for max_fail_percentage 8240 1726773032.60947: done checking for max_fail_percentage 8240 1726773032.60948: checking to see if all hosts have failed and the running result is not ok 8240 1726773032.60948: done checking to see if all hosts have failed 8240 1726773032.60950: getting the remaining hosts for this loop 8240 1726773032.60952: done getting the remaining hosts for this loop 8240 1726773032.60954: getting the next task for host managed_node2 8240 1726773032.60959: done getting next task for host managed_node2 8240 1726773032.60962: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8240 1726773032.60964: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773032.60973: getting variables 8240 1726773032.60974: in VariableManager get_vars() 8240 1726773032.61008: Calling all_inventory to load vars for managed_node2 8240 1726773032.61011: Calling groups_inventory to load vars for managed_node2 8240 1726773032.61013: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.61021: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.61024: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.61026: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.61186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.61308: done with get_vars() 8240 1726773032.61316: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:10:32 -0400 (0:00:00.058) 0:00:11.257 **** 8240 1726773032.61380: entering _queue_task() for managed_node2/include_tasks 8240 1726773032.61382: Creating lock for include_tasks 8240 1726773032.61538: worker is 1 (out of 1 available) 8240 1726773032.61553: exiting _queue_task() for managed_node2/include_tasks 8240 1726773032.61567: done queuing things up, now waiting for results queue to drain 8240 1726773032.61568: waiting for pending results... 8581 1726773032.61674: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8581 1726773032.61782: in run() - task 0affffe7-6841-885f-bbcf-0000000000ae 8581 1726773032.61801: variable 'ansible_search_path' from source: unknown 8581 1726773032.61805: variable 'ansible_search_path' from source: unknown 8581 1726773032.61832: calling self._execute() 8581 1726773032.61890: variable 'ansible_host' from source: host vars for 'managed_node2' 8581 1726773032.61898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8581 1726773032.61907: variable 'omit' from source: magic vars 8581 1726773032.61978: _execute() done 8581 1726773032.61984: dumping result to json 8581 1726773032.61989: done dumping result, returning 8581 1726773032.61996: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-885f-bbcf-0000000000ae] 8581 1726773032.62002: sending task result for task 0affffe7-6841-885f-bbcf-0000000000ae 8581 1726773032.62028: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000ae 8581 1726773032.62031: WORKER PROCESS EXITING 8240 1726773032.62153: no more pending results, returning what we have 8240 1726773032.62157: in VariableManager get_vars() 8240 1726773032.62186: Calling all_inventory to load vars for managed_node2 8240 1726773032.62189: Calling groups_inventory to load vars for managed_node2 8240 1726773032.62191: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.62197: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.62199: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.62201: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.62304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.62413: done with get_vars() 8240 1726773032.62419: variable 
'ansible_search_path' from source: unknown 8240 1726773032.62419: variable 'ansible_search_path' from source: unknown 8240 1726773032.62442: we have included files to process 8240 1726773032.62443: generating all_blocks data 8240 1726773032.62444: done generating all_blocks data 8240 1726773032.62449: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773032.62450: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773032.62452: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8240 1726773032.63055: done processing included file 8240 1726773032.63058: iterating over new_blocks loaded from include file 8240 1726773032.63059: in VariableManager get_vars() 8240 1726773032.63075: done with get_vars() 8240 1726773032.63076: filtering new block on tags 8240 1726773032.63095: done filtering new block on tags 8240 1726773032.63097: in VariableManager get_vars() 8240 1726773032.63109: done with get_vars() 8240 1726773032.63111: filtering new block on tags 8240 1726773032.63132: done filtering new block on tags 8240 1726773032.63134: in VariableManager get_vars() 8240 1726773032.63156: done with get_vars() 8240 1726773032.63158: filtering new block on tags 8240 1726773032.63195: done filtering new block on tags 8240 1726773032.63197: in VariableManager get_vars() 8240 1726773032.63216: done with get_vars() 8240 1726773032.63217: filtering new block on tags 8240 1726773032.63238: done filtering new block on tags 8240 1726773032.63240: done iterating over new_blocks loaded from include file 8240 1726773032.63241: extending task lists for all hosts with included blocks 8240 1726773032.63415: done extending task lists 8240 1726773032.63416: done processing included files 8240 1726773032.63417: results queue empty 8240 1726773032.63417: checking for any_errors_fatal 8240 1726773032.63420: done checking for any_errors_fatal 8240 1726773032.63420: checking for max_fail_percentage 8240 1726773032.63421: done checking for max_fail_percentage 8240 1726773032.63421: checking to see if all hosts have failed and the running result is not ok 8240 1726773032.63422: done checking to see if all hosts have failed 8240 1726773032.63422: getting the remaining hosts for this loop 8240 1726773032.63422: done getting the remaining hosts for this loop 8240 1726773032.63424: getting the next task for host managed_node2 8240 1726773032.63427: done getting next task for host managed_node2 8240 1726773032.63428: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8240 1726773032.63430: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773032.63435: getting variables 8240 1726773032.63436: in VariableManager get_vars() 8240 1726773032.63444: Calling all_inventory to load vars for managed_node2 8240 1726773032.63445: Calling groups_inventory to load vars for managed_node2 8240 1726773032.63446: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.63449: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.63450: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.63452: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.63538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.63651: done with get_vars() 8240 1726773032.63657: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:10:32 -0400 (0:00:00.023) 0:00:11.280 **** 8240 1726773032.63706: entering _queue_task() for managed_node2/setup 8240 1726773032.63857: worker is 1 (out of 1 available) 8240 1726773032.63872: exiting _queue_task() for managed_node2/setup 8240 1726773032.63886: done queuing things up, now waiting for results queue to drain 8240 1726773032.63888: waiting for pending results... 8582 1726773032.64007: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8582 1726773032.64156: in run() - task 0affffe7-6841-885f-bbcf-000000000159 8582 1726773032.64173: variable 'ansible_search_path' from source: unknown 8582 1726773032.64178: variable 'ansible_search_path' from source: unknown 8582 1726773032.64208: calling self._execute() 8582 1726773032.64279: variable 'ansible_host' from source: host vars for 'managed_node2' 8582 1726773032.64289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8582 1726773032.64297: variable 'omit' from source: magic vars 8582 1726773032.64743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8582 1726773032.66557: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8582 1726773032.66607: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8582 1726773032.66638: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8582 1726773032.66664: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8582 1726773032.66687: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8582 1726773032.66883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8582 1726773032.66907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 8582 1726773032.66926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8582 1726773032.66955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8582 1726773032.66968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8582 1726773032.67008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8582 1726773032.67026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8582 1726773032.67043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8582 1726773032.67072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8582 1726773032.67086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8582 1726773032.67203: variable '__kernel_settings_required_facts' from source: role '' all vars 8582 1726773032.67214: variable 'ansible_facts' from source: unknown 8582 1726773032.67265: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 8582 1726773032.67271: when evaluation is False, skipping this task 8582 1726773032.67275: _execute() done 8582 1726773032.67278: dumping result to json 8582 1726773032.67283: done dumping result, returning 8582 1726773032.67295: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-885f-bbcf-000000000159] 8582 1726773032.67300: sending task result for task 0affffe7-6841-885f-bbcf-000000000159 8582 1726773032.67320: done sending task result for task 0affffe7-6841-885f-bbcf-000000000159 8582 1726773032.67322: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8240 1726773032.67562: no more pending results, returning what we have 8240 1726773032.67566: results queue empty 8240 1726773032.67567: checking for any_errors_fatal 8240 1726773032.67569: done checking for any_errors_fatal 8240 1726773032.67569: checking for max_fail_percentage 8240 1726773032.67570: done checking for max_fail_percentage 8240 1726773032.67570: checking to see if all hosts have failed and the running result is not ok 8240 1726773032.67571: done checking to see if all hosts have failed 8240 1726773032.67571: getting the remaining hosts for this loop 8240 1726773032.67572: done 
getting the remaining hosts for this loop 8240 1726773032.67575: getting the next task for host managed_node2 8240 1726773032.67580: done getting next task for host managed_node2 8240 1726773032.67583: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8240 1726773032.67587: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773032.67596: getting variables 8240 1726773032.67597: in VariableManager get_vars() 8240 1726773032.67623: Calling all_inventory to load vars for managed_node2 8240 1726773032.67625: Calling groups_inventory to load vars for managed_node2 8240 1726773032.67626: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.67632: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.67634: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.67636: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.67740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.67881: done with get_vars() 8240 1726773032.67890: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:10:32 -0400 (0:00:00.042) 0:00:11.323 **** 8240 1726773032.67953: entering _queue_task() for managed_node2/stat 8240 1726773032.68140: worker is 1 (out of 1 available) 8240 1726773032.68153: exiting _queue_task() for managed_node2/stat 8240 1726773032.68167: done queuing things up, now waiting for results queue to drain 8240 1726773032.68168: waiting for pending results... 
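The role next includes set_vars.yml. Its first task, "Ensure ansible_facts used by role", only gathers facts when something listed in __kernel_settings_required_facts is missing from ansible_facts; here the difference is empty, so the task is skipped and the "Check if system is ostree" stat task is queued instead. A minimal sketch of that fact-gathering guard, assuming a bare setup call (any gather_subset or filter arguments used by the real role are not visible in this log):

- name: Ensure ansible_facts used by role
  ansible.builtin.setup:
  when: >-
    __kernel_settings_required_facts |
    difference(ansible_facts.keys() | list) | length > 0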
8584 1726773032.68372: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8584 1726773032.68513: in run() - task 0affffe7-6841-885f-bbcf-00000000015b 8584 1726773032.68531: variable 'ansible_search_path' from source: unknown 8584 1726773032.68535: variable 'ansible_search_path' from source: unknown 8584 1726773032.68566: calling self._execute() 8584 1726773032.68634: variable 'ansible_host' from source: host vars for 'managed_node2' 8584 1726773032.68643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8584 1726773032.68652: variable 'omit' from source: magic vars 8584 1726773032.69028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8584 1726773032.69205: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8584 1726773032.69237: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8584 1726773032.69262: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8584 1726773032.69295: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8584 1726773032.69352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8584 1726773032.69373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8584 1726773032.69398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8584 1726773032.69417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8584 1726773032.69500: variable '__kernel_settings_is_ostree' from source: set_fact 8584 1726773032.69512: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 8584 1726773032.69516: when evaluation is False, skipping this task 8584 1726773032.69520: _execute() done 8584 1726773032.69524: dumping result to json 8584 1726773032.69528: done dumping result, returning 8584 1726773032.69534: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-885f-bbcf-00000000015b] 8584 1726773032.69541: sending task result for task 0affffe7-6841-885f-bbcf-00000000015b 8584 1726773032.69564: done sending task result for task 0affffe7-6841-885f-bbcf-00000000015b 8584 1726773032.69567: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773032.69673: no more pending results, returning what we have 8240 1726773032.69676: results queue empty 8240 1726773032.69677: checking for any_errors_fatal 8240 1726773032.69683: done checking for any_errors_fatal 8240 1726773032.69683: checking for max_fail_percentage 8240 1726773032.69687: done checking for max_fail_percentage 8240 1726773032.69688: checking to see if all hosts have failed and the running result is not ok 
8240 1726773032.69688: done checking to see if all hosts have failed 8240 1726773032.69689: getting the remaining hosts for this loop 8240 1726773032.69690: done getting the remaining hosts for this loop 8240 1726773032.69693: getting the next task for host managed_node2 8240 1726773032.69699: done getting next task for host managed_node2 8240 1726773032.69702: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8240 1726773032.69705: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773032.69717: getting variables 8240 1726773032.69718: in VariableManager get_vars() 8240 1726773032.69745: Calling all_inventory to load vars for managed_node2 8240 1726773032.69747: Calling groups_inventory to load vars for managed_node2 8240 1726773032.69749: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.69756: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.69758: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.69760: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.69861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.69979: done with get_vars() 8240 1726773032.69987: done getting variables 8240 1726773032.70026: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:10:32 -0400 (0:00:00.020) 0:00:11.344 **** 8240 1726773032.70049: entering _queue_task() for managed_node2/set_fact 8240 1726773032.70209: worker is 1 (out of 1 available) 8240 1726773032.70222: exiting _queue_task() for managed_node2/set_fact 8240 1726773032.70235: done queuing things up, now waiting for results queue to drain 8240 1726773032.70237: waiting for pending results... 
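The ostree check itself is skipped because __kernel_settings_is_ostree was already set by set_fact earlier in the run, and the "Set flag to indicate system is ostree" task queued next is skipped for the same reason. The usual stat-then-set_fact pattern looks roughly like the sketch below; the register variable name and the /run/ostree-booted path are assumptions, while the task names and when conditions are taken from the log:

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted   # assumed marker file
  register: __ostree_booted_stat   # register name assumed
  when: not __kernel_settings_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined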
8586 1726773032.70338: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8586 1726773032.70450: in run() - task 0affffe7-6841-885f-bbcf-00000000015c 8586 1726773032.70465: variable 'ansible_search_path' from source: unknown 8586 1726773032.70470: variable 'ansible_search_path' from source: unknown 8586 1726773032.70498: calling self._execute() 8586 1726773032.70551: variable 'ansible_host' from source: host vars for 'managed_node2' 8586 1726773032.70559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8586 1726773032.70568: variable 'omit' from source: magic vars 8586 1726773032.70894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8586 1726773032.71118: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8586 1726773032.71153: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8586 1726773032.71180: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8586 1726773032.71207: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8586 1726773032.71267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8586 1726773032.71290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8586 1726773032.71308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8586 1726773032.71326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8586 1726773032.71409: variable '__kernel_settings_is_ostree' from source: set_fact 8586 1726773032.71419: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 8586 1726773032.71422: when evaluation is False, skipping this task 8586 1726773032.71424: _execute() done 8586 1726773032.71426: dumping result to json 8586 1726773032.71429: done dumping result, returning 8586 1726773032.71433: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-885f-bbcf-00000000015c] 8586 1726773032.71436: sending task result for task 0affffe7-6841-885f-bbcf-00000000015c 8586 1726773032.71457: done sending task result for task 0affffe7-6841-885f-bbcf-00000000015c 8586 1726773032.71459: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773032.71686: no more pending results, returning what we have 8240 1726773032.71690: results queue empty 8240 1726773032.71691: checking for any_errors_fatal 8240 1726773032.71694: done checking for any_errors_fatal 8240 1726773032.71695: checking for max_fail_percentage 8240 1726773032.71696: done checking for max_fail_percentage 8240 1726773032.71696: checking to see if all hosts have failed and the 
running result is not ok 8240 1726773032.71697: done checking to see if all hosts have failed 8240 1726773032.71697: getting the remaining hosts for this loop 8240 1726773032.71698: done getting the remaining hosts for this loop 8240 1726773032.71700: getting the next task for host managed_node2 8240 1726773032.71708: done getting next task for host managed_node2 8240 1726773032.71710: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8240 1726773032.71712: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773032.71721: getting variables 8240 1726773032.71722: in VariableManager get_vars() 8240 1726773032.71745: Calling all_inventory to load vars for managed_node2 8240 1726773032.71747: Calling groups_inventory to load vars for managed_node2 8240 1726773032.71748: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773032.71754: Calling all_plugins_play to load vars for managed_node2 8240 1726773032.71755: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773032.71757: Calling groups_plugins_play to load vars for managed_node2 8240 1726773032.71897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773032.72014: done with get_vars() 8240 1726773032.72020: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:10:32 -0400 (0:00:00.020) 0:00:11.364 **** 8240 1726773032.72087: entering _queue_task() for managed_node2/stat 8240 1726773032.72240: worker is 1 (out of 1 available) 8240 1726773032.72250: exiting _queue_task() for managed_node2/stat 8240 1726773032.72262: done queuing things up, now waiting for results queue to drain 8240 1726773032.72266: waiting for pending results... 
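Editor's note: the task skipped above ("Set flag to indicate system is ostree", set_vars.yml:15) follows the usual ostree-detection pattern in this role family; it only runs when __kernel_settings_is_ostree has not already been set by an earlier task in the same play. A minimal sketch of what such a pair of tasks can look like; the /run/ostree-booted path and the __ostree_booted_stat register name are illustrative assumptions, not taken from this log:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted  # assumed marker path; not shown in this log excerpt
      register: __ostree_booted_stat  # hypothetical register name
      when: not __kernel_settings_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __kernel_settings_is_ostree is defined

The when clause matches the false_condition reported in the skip result above: the fact is already defined, so the task is skipped.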
8587 1726773032.72369: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8587 1726773032.72487: in run() - task 0affffe7-6841-885f-bbcf-00000000015e 8587 1726773032.72503: variable 'ansible_search_path' from source: unknown 8587 1726773032.72506: variable 'ansible_search_path' from source: unknown 8587 1726773032.72533: calling self._execute() 8587 1726773032.72589: variable 'ansible_host' from source: host vars for 'managed_node2' 8587 1726773032.72598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8587 1726773032.72606: variable 'omit' from source: magic vars 8587 1726773032.72932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8587 1726773032.73104: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8587 1726773032.73139: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8587 1726773032.73168: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8587 1726773032.73201: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8587 1726773032.73260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8587 1726773032.73281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8587 1726773032.73300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8587 1726773032.73316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8587 1726773032.73413: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 8587 1726773032.73421: variable 'omit' from source: magic vars 8587 1726773032.73461: variable 'omit' from source: magic vars 8587 1726773032.73488: variable 'omit' from source: magic vars 8587 1726773032.73508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8587 1726773032.73529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8587 1726773032.73544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8587 1726773032.73558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8587 1726773032.73567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8587 1726773032.73592: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8587 1726773032.73597: variable 'ansible_host' from source: host vars for 'managed_node2' 8587 1726773032.73600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8587 1726773032.73660: Set connection var ansible_pipelining to False 8587 1726773032.73666: Set connection var ansible_timeout 
to 10 8587 1726773032.73672: Set connection var ansible_module_compression to ZIP_DEFLATED 8587 1726773032.73675: Set connection var ansible_shell_type to sh 8587 1726773032.73678: Set connection var ansible_shell_executable to /bin/sh 8587 1726773032.73680: Set connection var ansible_connection to ssh 8587 1726773032.73696: variable 'ansible_shell_executable' from source: unknown 8587 1726773032.73700: variable 'ansible_connection' from source: unknown 8587 1726773032.73702: variable 'ansible_module_compression' from source: unknown 8587 1726773032.73705: variable 'ansible_shell_type' from source: unknown 8587 1726773032.73706: variable 'ansible_shell_executable' from source: unknown 8587 1726773032.73708: variable 'ansible_host' from source: host vars for 'managed_node2' 8587 1726773032.73710: variable 'ansible_pipelining' from source: unknown 8587 1726773032.73712: variable 'ansible_timeout' from source: unknown 8587 1726773032.73714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8587 1726773032.73801: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8587 1726773032.73811: variable 'omit' from source: magic vars 8587 1726773032.73816: starting attempt loop 8587 1726773032.73818: running the handler 8587 1726773032.73828: _low_level_execute_command(): starting 8587 1726773032.73833: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8587 1726773032.76302: stdout chunk (state=2): >>>/root <<< 8587 1726773032.76426: stderr chunk (state=3): >>><<< 8587 1726773032.76434: stdout chunk (state=3): >>><<< 8587 1726773032.76453: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8587 1726773032.76468: _low_level_execute_command(): starting 8587 1726773032.76475: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291 `" && echo ansible-tmp-1726773032.7646124-8587-205559027572291="` echo /root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291 `" ) && sleep 0' 8587 1726773032.78949: stdout chunk (state=2): >>>ansible-tmp-1726773032.7646124-8587-205559027572291=/root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291 <<< 8587 1726773032.79079: stderr chunk (state=3): >>><<< 8587 1726773032.79089: stdout chunk (state=3): >>><<< 8587 1726773032.79106: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773032.7646124-8587-205559027572291=/root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291 , stderr= 8587 1726773032.79145: variable 'ansible_module_compression' from source: unknown 8587 1726773032.79197: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8587 1726773032.79224: variable 'ansible_facts' from source: unknown 8587 1726773032.79296: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291/AnsiballZ_stat.py 8587 1726773032.79400: Sending initial data 8587 1726773032.79407: Sent initial data (151 bytes) 8587 1726773032.82013: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpll972p7r /root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291/AnsiballZ_stat.py 
<<< 8587 1726773032.83180: stderr chunk (state=3): >>><<< 8587 1726773032.83192: stdout chunk (state=3): >>><<< 8587 1726773032.83213: done transferring module to remote 8587 1726773032.83224: _low_level_execute_command(): starting 8587 1726773032.83230: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291/ /root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291/AnsiballZ_stat.py && sleep 0' 8587 1726773032.85823: stderr chunk (state=2): >>><<< 8587 1726773032.85833: stdout chunk (state=2): >>><<< 8587 1726773032.85849: _low_level_execute_command() done: rc=0, stdout=, stderr= 8587 1726773032.85854: _low_level_execute_command(): starting 8587 1726773032.85859: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291/AnsiballZ_stat.py && sleep 0' 8587 1726773033.00999: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8587 1726773033.02097: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8587 1726773033.02114: stdout chunk (state=3): >>><<< 8587 1726773033.02126: stderr chunk (state=3): >>><<< 8587 1726773033.02139: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 
8587 1726773033.02215: done with _execute_module (stat, {'path': '/sbin/transactional-update', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8587 1726773033.02227: _low_level_execute_command(): starting 8587 1726773033.02232: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773032.7646124-8587-205559027572291/ > /dev/null 2>&1 && sleep 0' 8587 1726773033.04948: stderr chunk (state=2): >>><<< 8587 1726773033.04960: stdout chunk (state=2): >>><<< 8587 1726773033.04981: _low_level_execute_command() done: rc=0, stdout=, stderr= 8587 1726773033.04991: handler run complete 8587 1726773033.05014: attempt loop complete, returning result 8587 1726773033.05018: _execute() done 8587 1726773033.05021: dumping result to json 8587 1726773033.05025: done dumping result, returning 8587 1726773033.05033: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-885f-bbcf-00000000015e] 8587 1726773033.05039: sending task result for task 0affffe7-6841-885f-bbcf-00000000015e 8587 1726773033.05078: done sending task result for task 0affffe7-6841-885f-bbcf-00000000015e 8587 1726773033.05082: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 8240 1726773033.05488: no more pending results, returning what we have 8240 1726773033.05492: results queue empty 8240 1726773033.05492: checking for any_errors_fatal 8240 1726773033.05497: done checking for any_errors_fatal 8240 1726773033.05497: checking for max_fail_percentage 8240 1726773033.05498: done checking for max_fail_percentage 8240 1726773033.05499: checking to see if all hosts have failed and the running result is not ok 8240 1726773033.05500: done checking to see if all hosts have failed 8240 1726773033.05500: getting the remaining hosts for this loop 8240 1726773033.05502: done getting the remaining hosts for this loop 8240 1726773033.05505: getting the next task for host managed_node2 8240 1726773033.05510: done getting next task for host managed_node2 8240 1726773033.05513: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8240 1726773033.05517: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773033.05526: getting variables 8240 1726773033.05527: in VariableManager get_vars() 8240 1726773033.05560: Calling all_inventory to load vars for managed_node2 8240 1726773033.05563: Calling groups_inventory to load vars for managed_node2 8240 1726773033.05568: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773033.05577: Calling all_plugins_play to load vars for managed_node2 8240 1726773033.05580: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773033.05582: Calling groups_plugins_play to load vars for managed_node2 8240 1726773033.05747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773033.05955: done with get_vars() 8240 1726773033.05969: done getting variables 8240 1726773033.06026: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:10:33 -0400 (0:00:00.339) 0:00:11.704 **** 8240 1726773033.06058: entering _queue_task() for managed_node2/set_fact 8240 1726773033.06260: worker is 1 (out of 1 available) 8240 1726773033.06274: exiting _queue_task() for managed_node2/set_fact 8240 1726773033.06288: done queuing things up, now waiting for results queue to drain 8240 1726773033.06290: waiting for pending results... 
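Editor's note: the stat invocation traced above corresponds to a task shaped roughly like the sketch below. The module argument (path=/sbin/transactional-update), the register name __transactional_update_stat, and the guard condition are taken from this log; the exact YAML layout is an approximation:

    - name: Check if transactional-update exists in /sbin
      ansible.builtin.stat:
        path: /sbin/transactional-update
      register: __transactional_update_stat
      when: not __kernel_settings_is_transactional is defined

On this host the module returned stat.exists == false, which is what the following set_fact task consumes.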
8602 1726773033.06584: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8602 1726773033.06745: in run() - task 0affffe7-6841-885f-bbcf-00000000015f 8602 1726773033.06763: variable 'ansible_search_path' from source: unknown 8602 1726773033.06770: variable 'ansible_search_path' from source: unknown 8602 1726773033.06803: calling self._execute() 8602 1726773033.06877: variable 'ansible_host' from source: host vars for 'managed_node2' 8602 1726773033.06888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8602 1726773033.06898: variable 'omit' from source: magic vars 8602 1726773033.07353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8602 1726773033.07661: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8602 1726773033.07709: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8602 1726773033.07741: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8602 1726773033.07775: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8602 1726773033.07846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8602 1726773033.07872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8602 1726773033.07898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8602 1726773033.07921: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8602 1726773033.08035: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 8602 1726773033.08044: variable 'omit' from source: magic vars 8602 1726773033.08105: variable 'omit' from source: magic vars 8602 1726773033.08221: variable '__transactional_update_stat' from source: set_fact 8602 1726773033.08269: variable 'omit' from source: magic vars 8602 1726773033.08295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8602 1726773033.08322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8602 1726773033.08342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8602 1726773033.08359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8602 1726773033.08372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8602 1726773033.08403: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8602 1726773033.08409: variable 'ansible_host' from source: host vars for 'managed_node2' 8602 1726773033.08413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8602 1726773033.08563: Set connection var 
ansible_pipelining to False 8602 1726773033.08575: Set connection var ansible_timeout to 10 8602 1726773033.08583: Set connection var ansible_module_compression to ZIP_DEFLATED 8602 1726773033.08588: Set connection var ansible_shell_type to sh 8602 1726773033.08594: Set connection var ansible_shell_executable to /bin/sh 8602 1726773033.08600: Set connection var ansible_connection to ssh 8602 1726773033.08620: variable 'ansible_shell_executable' from source: unknown 8602 1726773033.08625: variable 'ansible_connection' from source: unknown 8602 1726773033.08629: variable 'ansible_module_compression' from source: unknown 8602 1726773033.08632: variable 'ansible_shell_type' from source: unknown 8602 1726773033.08635: variable 'ansible_shell_executable' from source: unknown 8602 1726773033.08638: variable 'ansible_host' from source: host vars for 'managed_node2' 8602 1726773033.08642: variable 'ansible_pipelining' from source: unknown 8602 1726773033.08645: variable 'ansible_timeout' from source: unknown 8602 1726773033.08648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8602 1726773033.08740: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8602 1726773033.08751: variable 'omit' from source: magic vars 8602 1726773033.08757: starting attempt loop 8602 1726773033.08760: running the handler 8602 1726773033.08772: handler run complete 8602 1726773033.08781: attempt loop complete, returning result 8602 1726773033.08784: _execute() done 8602 1726773033.08789: dumping result to json 8602 1726773033.08792: done dumping result, returning 8602 1726773033.08798: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-885f-bbcf-00000000015f] 8602 1726773033.08803: sending task result for task 0affffe7-6841-885f-bbcf-00000000015f 8602 1726773033.08826: done sending task result for task 0affffe7-6841-885f-bbcf-00000000015f 8602 1726773033.08830: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_is_transactional": false }, "changed": false } 8240 1726773033.09240: no more pending results, returning what we have 8240 1726773033.09243: results queue empty 8240 1726773033.09244: checking for any_errors_fatal 8240 1726773033.09251: done checking for any_errors_fatal 8240 1726773033.09252: checking for max_fail_percentage 8240 1726773033.09253: done checking for max_fail_percentage 8240 1726773033.09254: checking to see if all hosts have failed and the running result is not ok 8240 1726773033.09254: done checking to see if all hosts have failed 8240 1726773033.09255: getting the remaining hosts for this loop 8240 1726773033.09256: done getting the remaining hosts for this loop 8240 1726773033.09259: getting the next task for host managed_node2 8240 1726773033.09269: done getting next task for host managed_node2 8240 1726773033.09272: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8240 1726773033.09275: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773033.09284: getting variables 8240 1726773033.09292: in VariableManager get_vars() 8240 1726773033.09321: Calling all_inventory to load vars for managed_node2 8240 1726773033.09324: Calling groups_inventory to load vars for managed_node2 8240 1726773033.09325: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773033.09334: Calling all_plugins_play to load vars for managed_node2 8240 1726773033.09336: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773033.09338: Calling groups_plugins_play to load vars for managed_node2 8240 1726773033.09550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773033.09744: done with get_vars() 8240 1726773033.09754: done getting variables 8240 1726773033.09858: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:10:33 -0400 (0:00:00.038) 0:00:11.742 **** 8240 1726773033.09894: entering _queue_task() for managed_node2/include_vars 8240 1726773033.09896: Creating lock for include_vars 8240 1726773033.10094: worker is 1 (out of 1 available) 8240 1726773033.10105: exiting _queue_task() for managed_node2/include_vars 8240 1726773033.10118: done queuing things up, now waiting for results queue to drain 8240 1726773033.10119: waiting for pending results... 
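Editor's note: the "Set flag if transactional-update exists" task above turned that registered stat result into the fact __kernel_settings_is_transactional: false. A sketch, assuming the fact simply forwards the exists flag:

    - name: Set flag if transactional-update exists
      ansible.builtin.set_fact:
        __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
      when: not __kernel_settings_is_transactional is defined

Because the flag ends up false here, the transactional-update handling tasks later in main.yml are all skipped.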
8603 1726773033.10326: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8603 1726773033.10479: in run() - task 0affffe7-6841-885f-bbcf-000000000161 8603 1726773033.10498: variable 'ansible_search_path' from source: unknown 8603 1726773033.10503: variable 'ansible_search_path' from source: unknown 8603 1726773033.10533: calling self._execute() 8603 1726773033.10611: variable 'ansible_host' from source: host vars for 'managed_node2' 8603 1726773033.10620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8603 1726773033.10629: variable 'omit' from source: magic vars 8603 1726773033.10728: variable 'omit' from source: magic vars 8603 1726773033.10794: variable 'omit' from source: magic vars 8603 1726773033.11147: variable 'ffparams' from source: task vars 8603 1726773033.11331: variable 'ansible_facts' from source: unknown 8603 1726773033.11524: variable 'ansible_facts' from source: unknown 8603 1726773033.11657: variable 'ansible_facts' from source: unknown 8603 1726773033.11792: variable 'ansible_facts' from source: unknown 8603 1726773033.11905: variable 'role_path' from source: magic vars 8603 1726773033.12055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8603 1726773033.12517: Loaded config def from plugin (lookup/first_found) 8603 1726773033.12526: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 8603 1726773033.12561: variable 'ansible_search_path' from source: unknown 8603 1726773033.12588: variable 'ansible_search_path' from source: unknown 8603 1726773033.12599: variable 'ansible_search_path' from source: unknown 8603 1726773033.12607: variable 'ansible_search_path' from source: unknown 8603 1726773033.12615: variable 'ansible_search_path' from source: unknown 8603 1726773033.12634: variable 'omit' from source: magic vars 8603 1726773033.12657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8603 1726773033.12684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8603 1726773033.12704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8603 1726773033.12720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8603 1726773033.12730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8603 1726773033.12757: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8603 1726773033.12763: variable 'ansible_host' from source: host vars for 'managed_node2' 8603 1726773033.12770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8603 1726773033.12867: Set connection var ansible_pipelining to False 8603 1726773033.12876: Set connection var ansible_timeout to 10 8603 1726773033.12884: Set connection var ansible_module_compression to ZIP_DEFLATED 8603 1726773033.12891: Set connection var ansible_shell_type to sh 8603 1726773033.12896: Set connection var ansible_shell_executable to /bin/sh 8603 1726773033.12902: Set connection var ansible_connection to ssh 8603 1726773033.12923: variable 'ansible_shell_executable' from source: unknown 8603 1726773033.12928: variable 'ansible_connection' from source: unknown 8603 1726773033.12932: variable 
'ansible_module_compression' from source: unknown 8603 1726773033.12935: variable 'ansible_shell_type' from source: unknown 8603 1726773033.12938: variable 'ansible_shell_executable' from source: unknown 8603 1726773033.12941: variable 'ansible_host' from source: host vars for 'managed_node2' 8603 1726773033.12945: variable 'ansible_pipelining' from source: unknown 8603 1726773033.12948: variable 'ansible_timeout' from source: unknown 8603 1726773033.12952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8603 1726773033.13053: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8603 1726773033.13067: variable 'omit' from source: magic vars 8603 1726773033.13074: starting attempt loop 8603 1726773033.13077: running the handler 8603 1726773033.13128: handler run complete 8603 1726773033.13141: attempt loop complete, returning result 8603 1726773033.13145: _execute() done 8603 1726773033.13148: dumping result to json 8603 1726773033.13151: done dumping result, returning 8603 1726773033.13158: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-885f-bbcf-000000000161] 8603 1726773033.13163: sending task result for task 0affffe7-6841-885f-bbcf-000000000161 8603 1726773033.13197: done sending task result for task 0affffe7-6841-885f-bbcf-000000000161 8603 1726773033.13200: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8240 1726773033.13638: no more pending results, returning what we have 8240 1726773033.13641: results queue empty 8240 1726773033.13642: checking for any_errors_fatal 8240 1726773033.13646: done checking for any_errors_fatal 8240 1726773033.13647: checking for max_fail_percentage 8240 1726773033.13648: done checking for max_fail_percentage 8240 1726773033.13649: checking to see if all hosts have failed and the running result is not ok 8240 1726773033.13650: done checking to see if all hosts have failed 8240 1726773033.13650: getting the remaining hosts for this loop 8240 1726773033.13652: done getting the remaining hosts for this loop 8240 1726773033.13655: getting the next task for host managed_node2 8240 1726773033.13662: done getting next task for host managed_node2 8240 1726773033.13669: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8240 1726773033.13671: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773033.13682: getting variables 8240 1726773033.13683: in VariableManager get_vars() 8240 1726773033.13716: Calling all_inventory to load vars for managed_node2 8240 1726773033.13719: Calling groups_inventory to load vars for managed_node2 8240 1726773033.13721: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773033.13729: Calling all_plugins_play to load vars for managed_node2 8240 1726773033.13732: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773033.13735: Calling groups_plugins_play to load vars for managed_node2 8240 1726773033.13903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773033.14110: done with get_vars() 8240 1726773033.14121: done getting variables 8240 1726773033.14175: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:10:33 -0400 (0:00:00.043) 0:00:11.785 **** 8240 1726773033.14206: entering _queue_task() for managed_node2/package 8240 1726773033.14407: worker is 1 (out of 1 available) 8240 1726773033.14418: exiting _queue_task() for managed_node2/package 8240 1726773033.14429: done queuing things up, now waiting for results queue to drain 8240 1726773033.14430: waiting for pending results... 8604 1726773033.14650: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8604 1726773033.14794: in run() - task 0affffe7-6841-885f-bbcf-0000000000af 8604 1726773033.14812: variable 'ansible_search_path' from source: unknown 8604 1726773033.14817: variable 'ansible_search_path' from source: unknown 8604 1726773033.14849: calling self._execute() 8604 1726773033.14926: variable 'ansible_host' from source: host vars for 'managed_node2' 8604 1726773033.14935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8604 1726773033.14944: variable 'omit' from source: magic vars 8604 1726773033.15054: variable 'omit' from source: magic vars 8604 1726773033.15104: variable 'omit' from source: magic vars 8604 1726773033.15130: variable '__kernel_settings_packages' from source: include_vars 8604 1726773033.15394: variable '__kernel_settings_packages' from source: include_vars 8604 1726773033.15907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8604 1726773033.18237: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8604 1726773033.18310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8604 1726773033.18361: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8604 1726773033.18400: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8604 1726773033.18426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8604 1726773033.18523: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8604 1726773033.18550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8604 1726773033.18578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8604 1726773033.18618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8604 1726773033.18633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8604 1726773033.18739: variable '__kernel_settings_is_ostree' from source: set_fact 8604 1726773033.18747: variable 'omit' from source: magic vars 8604 1726773033.18781: variable 'omit' from source: magic vars 8604 1726773033.19384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8604 1726773033.19414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8604 1726773033.19434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8604 1726773033.19452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8604 1726773033.19467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8604 1726773033.19503: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8604 1726773033.19510: variable 'ansible_host' from source: host vars for 'managed_node2' 8604 1726773033.19515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8604 1726773033.19615: Set connection var ansible_pipelining to False 8604 1726773033.19623: Set connection var ansible_timeout to 10 8604 1726773033.19632: Set connection var ansible_module_compression to ZIP_DEFLATED 8604 1726773033.19635: Set connection var ansible_shell_type to sh 8604 1726773033.19640: Set connection var ansible_shell_executable to /bin/sh 8604 1726773033.19645: Set connection var ansible_connection to ssh 8604 1726773033.19672: variable 'ansible_shell_executable' from source: unknown 8604 1726773033.19677: variable 'ansible_connection' from source: unknown 8604 1726773033.19680: variable 'ansible_module_compression' from source: unknown 8604 1726773033.19683: variable 'ansible_shell_type' from source: unknown 8604 1726773033.19687: variable 'ansible_shell_executable' from source: unknown 8604 1726773033.19690: variable 'ansible_host' from source: host vars for 'managed_node2' 8604 1726773033.19693: variable 'ansible_pipelining' from source: unknown 8604 1726773033.19696: variable 'ansible_timeout' from source: unknown 8604 1726773033.19700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8604 1726773033.19793: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8604 1726773033.19806: variable 'omit' from source: magic vars 8604 1726773033.19813: starting attempt loop 8604 1726773033.19816: running the handler 8604 1726773033.19910: variable 'ansible_facts' from source: unknown 8604 1726773033.20027: _low_level_execute_command(): starting 8604 1726773033.20036: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8604 1726773033.23177: stdout chunk (state=2): >>>/root <<< 8604 1726773033.23304: stderr chunk (state=3): >>><<< 8604 1726773033.23312: stdout chunk (state=3): >>><<< 8604 1726773033.23332: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8604 1726773033.23347: _low_level_execute_command(): starting 8604 1726773033.23354: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946 `" && echo ansible-tmp-1726773033.2334065-8604-46871663345946="` echo /root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946 `" ) && sleep 0' 8604 1726773033.25924: stdout chunk (state=2): >>>ansible-tmp-1726773033.2334065-8604-46871663345946=/root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946 <<< 8604 1726773033.26083: stderr chunk (state=3): >>><<< 8604 1726773033.26094: stdout chunk (state=3): >>><<< 8604 1726773033.26116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773033.2334065-8604-46871663345946=/root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946 , stderr= 8604 1726773033.26147: variable 'ansible_module_compression' from source: unknown 8604 1726773033.26205: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 8604 1726773033.26251: variable 'ansible_facts' from source: unknown 8604 1726773033.26374: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946/AnsiballZ_dnf.py 8604 1726773033.27194: Sending initial data 8604 1726773033.27202: Sent initial data (149 bytes) 8604 1726773033.30250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp8aq2ti3v /root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946/AnsiballZ_dnf.py <<< 8604 1726773033.32097: stderr chunk (state=3): >>><<< 8604 1726773033.32106: stdout chunk (state=3): >>><<< 8604 1726773033.32125: done transferring module to remote 8604 1726773033.32135: _low_level_execute_command(): starting 8604 1726773033.32139: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946/ /root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946/AnsiballZ_dnf.py && sleep 0' 8604 1726773033.34532: stderr chunk (state=2): >>><<< 8604 1726773033.34541: stdout chunk (state=2): >>><<< 8604 1726773033.34555: _low_level_execute_command() done: rc=0, stdout=, stderr= 8604 1726773033.34558: _low_level_execute_command(): starting 8604 1726773033.34562: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946/AnsiballZ_dnf.py && sleep 0' 8604 1726773035.91089: stdout chunk (state=2): 
>>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8604 1726773035.98589: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8604 1726773035.98635: stderr chunk (state=3): >>><<< 8604 1726773035.98643: stdout chunk (state=3): >>><<< 8604 1726773035.98663: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
8604 1726773035.98700: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8604 1726773035.98709: _low_level_execute_command(): starting 8604 1726773035.98715: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773033.2334065-8604-46871663345946/ > /dev/null 2>&1 && sleep 0' 8604 1726773036.01186: stderr chunk (state=2): >>><<< 8604 1726773036.01197: stdout chunk (state=2): >>><<< 8604 1726773036.01213: _low_level_execute_command() done: rc=0, stdout=, stderr= 8604 1726773036.01221: handler run complete 8604 1726773036.01247: attempt loop complete, returning result 8604 1726773036.01251: _execute() done 8604 1726773036.01254: dumping result to json 8604 1726773036.01260: done dumping result, returning 8604 1726773036.01269: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-885f-bbcf-0000000000af] 8604 1726773036.01277: sending task result for task 0affffe7-6841-885f-bbcf-0000000000af 8604 1726773036.01309: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000af 8604 1726773036.01313: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8240 1726773036.01480: no more pending results, returning what we have 8240 1726773036.01483: results queue empty 8240 1726773036.01484: checking for any_errors_fatal 8240 1726773036.01492: done checking for any_errors_fatal 8240 1726773036.01492: checking for max_fail_percentage 8240 1726773036.01493: done checking for max_fail_percentage 8240 1726773036.01494: checking to see if all hosts have failed and the running result is not ok 8240 1726773036.01495: done checking to see if all hosts have failed 8240 1726773036.01497: getting the remaining hosts for this loop 8240 1726773036.01498: done getting the remaining hosts for this loop 8240 1726773036.01501: getting the next task for host managed_node2 8240 1726773036.01509: done getting next task for host managed_node2 8240 1726773036.01512: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8240 1726773036.01514: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773036.01523: getting variables 8240 1726773036.01524: in VariableManager get_vars() 8240 1726773036.01555: Calling all_inventory to load vars for managed_node2 8240 1726773036.01558: Calling groups_inventory to load vars for managed_node2 8240 1726773036.01559: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773036.01568: Calling all_plugins_play to load vars for managed_node2 8240 1726773036.01570: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773036.01573: Calling groups_plugins_play to load vars for managed_node2 8240 1726773036.01839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773036.01975: done with get_vars() 8240 1726773036.01983: done getting variables 8240 1726773036.02055: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:10:36 -0400 (0:00:02.878) 0:00:14.664 **** 8240 1726773036.02078: entering _queue_task() for managed_node2/debug 8240 1726773036.02079: Creating lock for debug 8240 1726773036.02253: worker is 1 (out of 1 available) 8240 1726773036.02267: exiting _queue_task() for managed_node2/debug 8240 1726773036.02280: done queuing things up, now waiting for results queue to drain 8240 1726773036.02282: waiting for pending results... 
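Editor's note: two steps traced above are worth sketching together. "Set platform/version specific variables" pulled in vars/default.yml via an include_vars driven by a first_found lookup (the log shows the ffparams task var, several ansible_facts references, and role_path), and it defined __kernel_settings_packages and __kernel_settings_services. "Ensure required packages are installed" then ran ansible.legacy.dnf with name=['tuned', 'python3-configobj'] and state=present. In the sketch below the distribution-specific candidate file names are assumptions that illustrate the pattern; only default.yml is confirmed by the ansible_included_var_files result, and any ostree-specific handling is omitted:

    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ lookup('first_found', ffparams) }}"
      vars:
        ffparams:
          files:
            # hypothetical distribution-specific candidates; only default.yml
            # is confirmed by the result above
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - default.yml
          paths:
            - "{{ role_path }}/vars"

    - name: Ensure required packages are installed
      ansible.builtin.package:
        name: "{{ __kernel_settings_packages }}"  # resolves to ['tuned', 'python3-configobj'] per default.yml
        state: present
      # the log shows __kernel_settings_is_ostree being consulted during this task,
      # presumably to pick an ostree-aware backend; that handling is not sketched here

On this run dnf reported "Nothing to do", i.e. both packages were already installed, so the task came back ok with changed: false after roughly 2.9 seconds.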
8737 1726773036.02398: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8737 1726773036.02517: in run() - task 0affffe7-6841-885f-bbcf-0000000000b1 8737 1726773036.02533: variable 'ansible_search_path' from source: unknown 8737 1726773036.02537: variable 'ansible_search_path' from source: unknown 8737 1726773036.02566: calling self._execute() 8737 1726773036.02629: variable 'ansible_host' from source: host vars for 'managed_node2' 8737 1726773036.02637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8737 1726773036.02647: variable 'omit' from source: magic vars 8737 1726773036.03004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8737 1726773036.04934: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8737 1726773036.04991: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8737 1726773036.05018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8737 1726773036.05042: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8737 1726773036.05060: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8737 1726773036.05120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8737 1726773036.05139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8737 1726773036.05155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8737 1726773036.05190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8737 1726773036.05204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8737 1726773036.05288: variable '__kernel_settings_is_transactional' from source: set_fact 8737 1726773036.05303: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8737 1726773036.05307: when evaluation is False, skipping this task 8737 1726773036.05310: _execute() done 8737 1726773036.05312: dumping result to json 8737 1726773036.05314: done dumping result, returning 8737 1726773036.05319: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-0000000000b1] 8737 1726773036.05323: sending task result for task 0affffe7-6841-885f-bbcf-0000000000b1 8737 1726773036.05345: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000b1 8737 1726773036.05347: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8240 
1726773036.05631: no more pending results, returning what we have 8240 1726773036.05633: results queue empty 8240 1726773036.05634: checking for any_errors_fatal 8240 1726773036.05640: done checking for any_errors_fatal 8240 1726773036.05640: checking for max_fail_percentage 8240 1726773036.05641: done checking for max_fail_percentage 8240 1726773036.05641: checking to see if all hosts have failed and the running result is not ok 8240 1726773036.05642: done checking to see if all hosts have failed 8240 1726773036.05643: getting the remaining hosts for this loop 8240 1726773036.05643: done getting the remaining hosts for this loop 8240 1726773036.05646: getting the next task for host managed_node2 8240 1726773036.05651: done getting next task for host managed_node2 8240 1726773036.05653: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8240 1726773036.05655: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773036.05664: getting variables 8240 1726773036.05665: in VariableManager get_vars() 8240 1726773036.05691: Calling all_inventory to load vars for managed_node2 8240 1726773036.05693: Calling groups_inventory to load vars for managed_node2 8240 1726773036.05694: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773036.05701: Calling all_plugins_play to load vars for managed_node2 8240 1726773036.05703: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773036.05704: Calling groups_plugins_play to load vars for managed_node2 8240 1726773036.05813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773036.05939: done with get_vars() 8240 1726773036.05948: done getting variables 8240 1726773036.06044: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.039) 0:00:14.704 **** 8240 1726773036.06068: entering _queue_task() for managed_node2/reboot 8240 1726773036.06070: Creating lock for reboot 8240 1726773036.06251: worker is 1 (out of 1 available) 8240 1726773036.06264: exiting _queue_task() for managed_node2/reboot 8240 1726773036.06276: done queuing things up, now waiting for results queue to drain 8240 1726773036.06278: waiting for pending results... 
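Editor's note: the "Notify user that reboot is needed to apply changes" task above was skipped because __kernel_settings_is_transactional | d(false) evaluates to false on this host. A sketch of that guard, with the message wording assumed:

    - name: Notify user that reboot is needed to apply changes
      ansible.builtin.debug:
        msg: Reboot is required on transactional update systems to apply changes  # wording assumed
      when: __kernel_settings_is_transactional | d(false)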
8739 1726773036.06419: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8739 1726773036.06552: in run() - task 0affffe7-6841-885f-bbcf-0000000000b2 8739 1726773036.06572: variable 'ansible_search_path' from source: unknown 8739 1726773036.06576: variable 'ansible_search_path' from source: unknown 8739 1726773036.06607: calling self._execute() 8739 1726773036.06683: variable 'ansible_host' from source: host vars for 'managed_node2' 8739 1726773036.06693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8739 1726773036.06701: variable 'omit' from source: magic vars 8739 1726773036.07146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8739 1726773036.09190: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8739 1726773036.09271: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8739 1726773036.09312: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8739 1726773036.09347: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8739 1726773036.09377: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8739 1726773036.09451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8739 1726773036.09483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8739 1726773036.09541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8739 1726773036.09583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8739 1726773036.09602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8739 1726773036.09712: variable '__kernel_settings_is_transactional' from source: set_fact 8739 1726773036.09732: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8739 1726773036.09737: when evaluation is False, skipping this task 8739 1726773036.09741: _execute() done 8739 1726773036.09745: dumping result to json 8739 1726773036.09748: done dumping result, returning 8739 1726773036.09755: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-885f-bbcf-0000000000b2] 8739 1726773036.09761: sending task result for task 0affffe7-6841-885f-bbcf-0000000000b2 8739 1726773036.09798: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000b2 8739 1726773036.09802: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional 
result was False" } 8240 1726773036.10208: no more pending results, returning what we have 8240 1726773036.10211: results queue empty 8240 1726773036.10212: checking for any_errors_fatal 8240 1726773036.10217: done checking for any_errors_fatal 8240 1726773036.10218: checking for max_fail_percentage 8240 1726773036.10219: done checking for max_fail_percentage 8240 1726773036.10219: checking to see if all hosts have failed and the running result is not ok 8240 1726773036.10220: done checking to see if all hosts have failed 8240 1726773036.10221: getting the remaining hosts for this loop 8240 1726773036.10222: done getting the remaining hosts for this loop 8240 1726773036.10226: getting the next task for host managed_node2 8240 1726773036.10232: done getting next task for host managed_node2 8240 1726773036.10236: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8240 1726773036.10238: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773036.10251: getting variables 8240 1726773036.10252: in VariableManager get_vars() 8240 1726773036.10350: Calling all_inventory to load vars for managed_node2 8240 1726773036.10354: Calling groups_inventory to load vars for managed_node2 8240 1726773036.10356: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773036.10368: Calling all_plugins_play to load vars for managed_node2 8240 1726773036.10371: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773036.10374: Calling groups_plugins_play to load vars for managed_node2 8240 1726773036.10530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773036.10700: done with get_vars() 8240 1726773036.10709: done getting variables 8240 1726773036.10752: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.047) 0:00:14.751 **** 8240 1726773036.10776: entering _queue_task() for managed_node2/fail 8240 1726773036.10952: worker is 1 (out of 1 available) 8240 1726773036.10977: exiting _queue_task() for managed_node2/fail 8240 1726773036.10992: done queuing things up, now waiting for results queue to drain 8240 1726773036.10993: waiting for pending results... 
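Editor's note: "Fail if reboot is needed and not set" at tasks/main.yml:34 is skipped for the same reason. The d() in the condition is the short alias for Jinja2's default filter, so __kernel_settings_is_transactional | d(false) reads as "use the variable if defined, otherwise treat it as false", which is why an ordinary (non-transactional-update) host skips this whole block. A plausible sketch, with the failure message left as a placeholder because it is not shown anywhere in this log, and noting that the reported false_condition may be only one entry of a longer when list:

# Hypothetical sketch -- name, fail action plugin, and the reported
# false_condition come from the log; the msg wording is a placeholder.
- name: Fail if reboot is needed and not set
  ansible.builtin.fail:
    # Placeholder text; the role's actual message is not visible here.
    msg: Reboot is required to apply changes, but rebooting is not enabled for this run
  when: __kernel_settings_is_transactional | d(false)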
8742 1726773036.11107: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8742 1726773036.11220: in run() - task 0affffe7-6841-885f-bbcf-0000000000b3 8742 1726773036.11236: variable 'ansible_search_path' from source: unknown 8742 1726773036.11241: variable 'ansible_search_path' from source: unknown 8742 1726773036.11269: calling self._execute() 8742 1726773036.11332: variable 'ansible_host' from source: host vars for 'managed_node2' 8742 1726773036.11340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8742 1726773036.11349: variable 'omit' from source: magic vars 8742 1726773036.11691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8742 1726773036.13431: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8742 1726773036.13499: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8742 1726773036.13535: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8742 1726773036.13571: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8742 1726773036.13598: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8742 1726773036.13674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8742 1726773036.13705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8742 1726773036.13726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8742 1726773036.13751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8742 1726773036.13759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8742 1726773036.13845: variable '__kernel_settings_is_transactional' from source: set_fact 8742 1726773036.13860: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8742 1726773036.13863: when evaluation is False, skipping this task 8742 1726773036.13868: _execute() done 8742 1726773036.13871: dumping result to json 8742 1726773036.13873: done dumping result, returning 8742 1726773036.13878: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-885f-bbcf-0000000000b3] 8742 1726773036.13882: sending task result for task 0affffe7-6841-885f-bbcf-0000000000b3 8742 1726773036.13910: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000b3 8742 1726773036.13912: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional 
result was False" } 8240 1726773036.14124: no more pending results, returning what we have 8240 1726773036.14126: results queue empty 8240 1726773036.14127: checking for any_errors_fatal 8240 1726773036.14131: done checking for any_errors_fatal 8240 1726773036.14131: checking for max_fail_percentage 8240 1726773036.14133: done checking for max_fail_percentage 8240 1726773036.14133: checking to see if all hosts have failed and the running result is not ok 8240 1726773036.14134: done checking to see if all hosts have failed 8240 1726773036.14134: getting the remaining hosts for this loop 8240 1726773036.14135: done getting the remaining hosts for this loop 8240 1726773036.14137: getting the next task for host managed_node2 8240 1726773036.14143: done getting next task for host managed_node2 8240 1726773036.14146: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8240 1726773036.14147: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773036.14156: getting variables 8240 1726773036.14157: in VariableManager get_vars() 8240 1726773036.14180: Calling all_inventory to load vars for managed_node2 8240 1726773036.14182: Calling groups_inventory to load vars for managed_node2 8240 1726773036.14184: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773036.14193: Calling all_plugins_play to load vars for managed_node2 8240 1726773036.14196: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773036.14197: Calling groups_plugins_play to load vars for managed_node2 8240 1726773036.14305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773036.14433: done with get_vars() 8240 1726773036.14441: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.037) 0:00:14.788 **** 8240 1726773036.14510: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773036.14511: Creating lock for fedora.linux_system_roles.kernel_settings_get_config 8240 1726773036.14676: worker is 1 (out of 1 available) 8240 1726773036.14692: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773036.14705: done queuing things up, now waiting for results queue to drain 8240 1726773036.14708: waiting for pending results... 
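Editor's note: the next task, "Read tuned main config" at tasks/main.yml:42, runs the role's own module fedora.linux_system_roles.kernel_settings_get_config with path /etc/tuned/tuned-main.conf (the worker output below shows the role variable __kernel_settings_tuned_main_conf_file being resolved for it). Later debug lines consume the result as __kernel_settings_register_tuned_main, which suggests the task registers its output under that name. A minimal sketch, assuming exactly that:

# Hypothetical sketch -- module name and path argument are confirmed by the
# log below; the register name is inferred from later variable lookups.
- name: Read tuned main config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: "{{ __kernel_settings_tuned_main_conf_file }}"
  register: __kernel_settings_register_tuned_main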
8744 1726773036.14821: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8744 1726773036.14931: in run() - task 0affffe7-6841-885f-bbcf-0000000000b5 8744 1726773036.14947: variable 'ansible_search_path' from source: unknown 8744 1726773036.14951: variable 'ansible_search_path' from source: unknown 8744 1726773036.14981: calling self._execute() 8744 1726773036.15098: variable 'ansible_host' from source: host vars for 'managed_node2' 8744 1726773036.15106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8744 1726773036.15116: variable 'omit' from source: magic vars 8744 1726773036.15192: variable 'omit' from source: magic vars 8744 1726773036.15226: variable 'omit' from source: magic vars 8744 1726773036.15248: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 8744 1726773036.15458: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 8744 1726773036.15522: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8744 1726773036.15553: variable 'omit' from source: magic vars 8744 1726773036.15590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8744 1726773036.15617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8744 1726773036.15635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8744 1726773036.15649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8744 1726773036.15661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8744 1726773036.15689: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8744 1726773036.15695: variable 'ansible_host' from source: host vars for 'managed_node2' 8744 1726773036.15700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8744 1726773036.15770: Set connection var ansible_pipelining to False 8744 1726773036.15778: Set connection var ansible_timeout to 10 8744 1726773036.15787: Set connection var ansible_module_compression to ZIP_DEFLATED 8744 1726773036.15791: Set connection var ansible_shell_type to sh 8744 1726773036.15796: Set connection var ansible_shell_executable to /bin/sh 8744 1726773036.15802: Set connection var ansible_connection to ssh 8744 1726773036.15817: variable 'ansible_shell_executable' from source: unknown 8744 1726773036.15821: variable 'ansible_connection' from source: unknown 8744 1726773036.15824: variable 'ansible_module_compression' from source: unknown 8744 1726773036.15828: variable 'ansible_shell_type' from source: unknown 8744 1726773036.15831: variable 'ansible_shell_executable' from source: unknown 8744 1726773036.15834: variable 'ansible_host' from source: host vars for 'managed_node2' 8744 1726773036.15838: variable 'ansible_pipelining' from source: unknown 8744 1726773036.15840: variable 'ansible_timeout' from source: unknown 8744 1726773036.15842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8744 1726773036.15972: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8744 1726773036.15983: variable 'omit' from source: magic vars 8744 1726773036.15991: starting attempt loop 8744 1726773036.15995: running the handler 8744 1726773036.16007: _low_level_execute_command(): starting 8744 1726773036.16020: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8744 1726773036.18630: stdout chunk (state=2): >>>/root <<< 8744 1726773036.18796: stderr chunk (state=3): >>><<< 8744 1726773036.18803: stdout chunk (state=3): >>><<< 8744 1726773036.18820: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8744 1726773036.18831: _low_level_execute_command(): starting 8744 1726773036.18835: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343 `" && echo ansible-tmp-1726773036.1882644-8744-42224958500343="` echo /root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343 `" ) && sleep 0' 8744 1726773036.22038: stdout chunk (state=2): >>>ansible-tmp-1726773036.1882644-8744-42224958500343=/root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343 <<< 8744 1726773036.22093: stderr chunk (state=3): >>><<< 8744 1726773036.22102: stdout chunk (state=3): >>><<< 8744 1726773036.22126: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773036.1882644-8744-42224958500343=/root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343 , stderr= 8744 1726773036.22175: variable 'ansible_module_compression' from source: unknown 8744 1726773036.22220: ANSIBALLZ: Using lock for fedora.linux_system_roles.kernel_settings_get_config 8744 1726773036.22226: ANSIBALLZ: Acquiring lock 8744 1726773036.22230: ANSIBALLZ: Lock acquired: 139787571123152 8744 1726773036.22233: ANSIBALLZ: Creating module 8744 1726773036.35994: ANSIBALLZ: Writing module into payload 8744 1726773036.36056: ANSIBALLZ: Writing module 8744 1726773036.36079: ANSIBALLZ: Renaming module 8744 1726773036.36088: ANSIBALLZ: Done creating module 8744 1726773036.36106: variable 'ansible_facts' from source: unknown 8744 1726773036.36163: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343/AnsiballZ_kernel_settings_get_config.py 8744 1726773036.36272: Sending initial data 8744 1726773036.36280: Sent initial data (172 bytes) 8744 1726773036.39043: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmptjs6xlfn /root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343/AnsiballZ_kernel_settings_get_config.py <<< 8744 1726773036.40123: stderr chunk (state=3): >>><<< 8744 1726773036.40134: stdout chunk (state=3): >>><<< 8744 1726773036.40154: done transferring module to remote 8744 1726773036.40167: _low_level_execute_command(): starting 8744 1726773036.40174: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343/ /root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8744 1726773036.42581: stderr chunk (state=2): >>><<< 8744 1726773036.42595: stdout chunk (state=2): >>><<< 8744 1726773036.42611: _low_level_execute_command() done: rc=0, stdout=, stderr= 8744 1726773036.42616: _low_level_execute_command(): starting 8744 1726773036.42621: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8744 1726773036.58403: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 8744 1726773036.60655: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8744 1726773036.60669: stdout chunk (state=3): >>><<< 8744 1726773036.60682: stderr chunk (state=3): >>><<< 8744 1726773036.60698: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 8744 1726773036.60730: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8744 1726773036.60742: _low_level_execute_command(): starting 8744 1726773036.60748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773036.1882644-8744-42224958500343/ > /dev/null 2>&1 && sleep 0' 8744 1726773036.63875: stderr chunk (state=2): >>><<< 8744 1726773036.63888: stdout chunk (state=2): >>><<< 8744 1726773036.63906: _low_level_execute_command() done: rc=0, stdout=, stderr= 8744 1726773036.63915: handler run complete 8744 1726773036.63934: attempt loop complete, returning result 8744 1726773036.63940: _execute() done 8744 1726773036.63943: dumping result to json 8744 1726773036.63947: done dumping result, returning 8744 1726773036.63954: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-885f-bbcf-0000000000b5] 8744 1726773036.63960: sending task result for task 0affffe7-6841-885f-bbcf-0000000000b5 8744 1726773036.64000: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000b5 8744 1726773036.64004: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8240 1726773036.64561: no more pending results, returning what we have 8240 1726773036.64563: 
results queue empty 8240 1726773036.64564: checking for any_errors_fatal 8240 1726773036.64567: done checking for any_errors_fatal 8240 1726773036.64567: checking for max_fail_percentage 8240 1726773036.64568: done checking for max_fail_percentage 8240 1726773036.64569: checking to see if all hosts have failed and the running result is not ok 8240 1726773036.64569: done checking to see if all hosts have failed 8240 1726773036.64570: getting the remaining hosts for this loop 8240 1726773036.64570: done getting the remaining hosts for this loop 8240 1726773036.64573: getting the next task for host managed_node2 8240 1726773036.64577: done getting next task for host managed_node2 8240 1726773036.64579: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8240 1726773036.64581: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773036.64589: getting variables 8240 1726773036.64591: in VariableManager get_vars() 8240 1726773036.64612: Calling all_inventory to load vars for managed_node2 8240 1726773036.64614: Calling groups_inventory to load vars for managed_node2 8240 1726773036.64616: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773036.64623: Calling all_plugins_play to load vars for managed_node2 8240 1726773036.64625: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773036.64632: Calling groups_plugins_play to load vars for managed_node2 8240 1726773036.64737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773036.64858: done with get_vars() 8240 1726773036.64866: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.504) 0:00:15.293 **** 8240 1726773036.64936: entering _queue_task() for managed_node2/stat 8240 1726773036.65104: worker is 1 (out of 1 available) 8240 1726773036.65119: exiting _queue_task() for managed_node2/stat 8240 1726773036.65131: done queuing things up, now waiting for results queue to drain 8240 1726773036.65132: waiting for pending results... 
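Editor's note: "Find tuned profile parent directory" at tasks/main.yml:50 is a looped stat task. The per-item output below shows an empty item skipped via the guard item | length > 0, then /etc/tuned/profiles (does not exist) and /etc/tuned (exists) being stat'ed; the loop list is built from task vars (__prof_from_conf, __data) whose full expression is not visible here, and the result is later read back as __kernel_settings_find_profile_dirs. A plausible shape under those assumptions:

# Hypothetical sketch -- the stat module, per-item path, the
# `item | length > 0` guard, and the candidate paths are confirmed below;
# the exact loop expression and register name are inferred.
- name: Find tuned profile parent directory
  ansible.builtin.stat:
    path: "{{ item }}"
  loop: "{{ __prof_from_conf }}"  # resolves to ['', '/etc/tuned/profiles', '/etc/tuned'] in this run
  when: item | length > 0
  register: __kernel_settings_find_profile_dirs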
8781 1726773036.65258: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8781 1726773036.65394: in run() - task 0affffe7-6841-885f-bbcf-0000000000b6 8781 1726773036.65412: variable 'ansible_search_path' from source: unknown 8781 1726773036.65417: variable 'ansible_search_path' from source: unknown 8781 1726773036.65453: variable '__prof_from_conf' from source: task vars 8781 1726773036.65749: variable '__prof_from_conf' from source: task vars 8781 1726773036.65935: variable '__data' from source: task vars 8781 1726773036.66013: variable '__kernel_settings_register_tuned_main' from source: set_fact 8781 1726773036.66219: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8781 1726773036.66230: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8781 1726773036.66291: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8781 1726773036.66309: variable 'omit' from source: magic vars 8781 1726773036.66412: variable 'ansible_host' from source: host vars for 'managed_node2' 8781 1726773036.66425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8781 1726773036.66435: variable 'omit' from source: magic vars 8781 1726773036.66689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8781 1726773036.68816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8781 1726773036.68889: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8781 1726773036.68926: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8781 1726773036.68963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8781 1726773036.68994: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8781 1726773036.69071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8781 1726773036.69101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8781 1726773036.69125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8781 1726773036.69164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8781 1726773036.69181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8781 1726773036.69279: variable 'item' from source: unknown 8781 1726773036.69298: Evaluated conditional (item | length > 0): False 8781 1726773036.69303: when evaluation is False, skipping this task 8781 1726773036.69338: variable 'item' from source: unknown 8781 1726773036.69417: variable 'item' from source: unknown skipping: [managed_node2] => (item=) => { 
"ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 8781 1726773036.69509: variable 'ansible_host' from source: host vars for 'managed_node2' 8781 1726773036.69520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8781 1726773036.69528: variable 'omit' from source: magic vars 8781 1726773036.69646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8781 1726773036.69665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8781 1726773036.69683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8781 1726773036.69714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8781 1726773036.69726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8781 1726773036.69781: variable 'item' from source: unknown 8781 1726773036.69792: Evaluated conditional (item | length > 0): True 8781 1726773036.69801: variable 'omit' from source: magic vars 8781 1726773036.69831: variable 'omit' from source: magic vars 8781 1726773036.69860: variable 'item' from source: unknown 8781 1726773036.69906: variable 'item' from source: unknown 8781 1726773036.69922: variable 'omit' from source: magic vars 8781 1726773036.69944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8781 1726773036.69964: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8781 1726773036.69983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8781 1726773036.70002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8781 1726773036.70012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8781 1726773036.70037: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8781 1726773036.70042: variable 'ansible_host' from source: host vars for 'managed_node2' 8781 1726773036.70046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8781 1726773036.70114: Set connection var ansible_pipelining to False 8781 1726773036.70122: Set connection var ansible_timeout to 10 8781 1726773036.70129: Set connection var ansible_module_compression to ZIP_DEFLATED 8781 1726773036.70133: Set connection var ansible_shell_type to sh 8781 1726773036.70138: Set connection var ansible_shell_executable to /bin/sh 8781 1726773036.70143: Set connection var ansible_connection to ssh 8781 1726773036.70157: variable 'ansible_shell_executable' from source: unknown 8781 1726773036.70161: variable 'ansible_connection' from source: unknown 8781 1726773036.70164: variable 
'ansible_module_compression' from source: unknown 8781 1726773036.70167: variable 'ansible_shell_type' from source: unknown 8781 1726773036.70170: variable 'ansible_shell_executable' from source: unknown 8781 1726773036.70173: variable 'ansible_host' from source: host vars for 'managed_node2' 8781 1726773036.70177: variable 'ansible_pipelining' from source: unknown 8781 1726773036.70181: variable 'ansible_timeout' from source: unknown 8781 1726773036.70192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8781 1726773036.70290: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8781 1726773036.70301: variable 'omit' from source: magic vars 8781 1726773036.70307: starting attempt loop 8781 1726773036.70311: running the handler 8781 1726773036.70322: _low_level_execute_command(): starting 8781 1726773036.70328: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8781 1726773036.72882: stdout chunk (state=2): >>>/root <<< 8781 1726773036.73004: stderr chunk (state=3): >>><<< 8781 1726773036.73015: stdout chunk (state=3): >>><<< 8781 1726773036.73034: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8781 1726773036.73047: _low_level_execute_command(): starting 8781 1726773036.73055: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813 `" && echo ansible-tmp-1726773036.7304246-8781-103618955862813="` echo /root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813 `" ) && sleep 0' 8781 1726773036.75714: stdout chunk (state=2): >>>ansible-tmp-1726773036.7304246-8781-103618955862813=/root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813 <<< 8781 1726773036.75854: stderr chunk (state=3): >>><<< 8781 1726773036.75864: stdout chunk (state=3): >>><<< 8781 1726773036.75887: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773036.7304246-8781-103618955862813=/root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813 , stderr= 8781 1726773036.75934: variable 'ansible_module_compression' from source: unknown 8781 1726773036.75992: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8781 1726773036.76026: variable 'ansible_facts' from source: unknown 8781 1726773036.76123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813/AnsiballZ_stat.py 8781 1726773036.76570: Sending initial data 8781 1726773036.76577: Sent initial data (151 bytes) 8781 1726773036.78895: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp7ko1behy /root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813/AnsiballZ_stat.py <<< 8781 1726773036.80006: stderr chunk (state=3): >>><<< 8781 1726773036.80017: stdout chunk (state=3): >>><<< 8781 1726773036.80038: done transferring module to remote 8781 1726773036.80050: _low_level_execute_command(): starting 8781 1726773036.80056: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813/ /root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813/AnsiballZ_stat.py && sleep 0' 
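Editor's note: before each remote module run the worker records the connection settings it will use ("Set connection var ansible_pipelining to False", "... ansible_timeout to 10", shell type sh, /bin/sh, connection ssh). Nothing in this log shows these being set explicitly, so they appear to be the effective defaults for the run; they are ordinary Ansible connection variables and could be overridden in inventory or group_vars if different behaviour were wanted, for example:

# Illustrative only -- these values mirror what the log reports as the
# effective settings; the log does not show them being configured anywhere.
ansible_connection: ssh
ansible_pipelining: false
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh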
8781 1726773036.82474: stderr chunk (state=2): >>><<< 8781 1726773036.82487: stdout chunk (state=2): >>><<< 8781 1726773036.82503: _low_level_execute_command() done: rc=0, stdout=, stderr= 8781 1726773036.82507: _low_level_execute_command(): starting 8781 1726773036.82512: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813/AnsiballZ_stat.py && sleep 0' 8781 1726773036.97603: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8781 1726773036.98628: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8781 1726773036.98640: stdout chunk (state=3): >>><<< 8781 1726773036.98651: stderr chunk (state=3): >>><<< 8781 1726773036.98664: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 8781 1726773036.98692: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8781 1726773036.98705: _low_level_execute_command(): starting 8781 1726773036.98711: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773036.7304246-8781-103618955862813/ > /dev/null 2>&1 && sleep 0' 8781 1726773037.01361: stderr chunk (state=2): >>><<< 8781 1726773037.01373: stdout chunk (state=2): >>><<< 8781 1726773037.01391: _low_level_execute_command() done: rc=0, stdout=, stderr= 8781 1726773037.01399: handler run complete 8781 1726773037.01416: attempt loop complete, returning result 8781 1726773037.01431: variable 'item' from source: unknown 8781 1726773037.01491: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 8781 1726773037.01580: variable 'ansible_host' from source: host vars for 'managed_node2' 8781 1726773037.01593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8781 1726773037.01603: variable 'omit' from source: magic vars 8781 1726773037.01714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8781 1726773037.01736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8781 1726773037.01755: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8781 1726773037.01782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8781 1726773037.01796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8781 1726773037.01854: variable 'item' from source: unknown 8781 1726773037.01863: Evaluated conditional (item | length > 0): True 8781 1726773037.01869: variable 'omit' from source: magic vars 8781 1726773037.01881: variable 'omit' from source: magic vars 8781 1726773037.01910: variable 'item' from source: unknown 8781 1726773037.01955: variable 'item' from source: unknown 8781 1726773037.01970: variable 'omit' from source: magic vars 8781 1726773037.01990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8781 1726773037.01999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8781 1726773037.02006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8781 1726773037.02019: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8781 1726773037.02023: variable 'ansible_host' from source: host vars for 'managed_node2' 8781 1726773037.02027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8781 1726773037.02079: Set connection var ansible_pipelining to False 8781 1726773037.02088: Set connection var ansible_timeout to 10 8781 1726773037.02095: Set connection var ansible_module_compression to ZIP_DEFLATED 8781 1726773037.02099: Set connection var ansible_shell_type to sh 8781 1726773037.02103: Set connection var ansible_shell_executable to /bin/sh 8781 1726773037.02108: Set connection var ansible_connection to ssh 8781 1726773037.02121: variable 'ansible_shell_executable' from source: unknown 8781 1726773037.02124: variable 'ansible_connection' from source: unknown 8781 1726773037.02128: variable 'ansible_module_compression' from source: unknown 8781 1726773037.02131: variable 'ansible_shell_type' from source: unknown 8781 1726773037.02134: variable 'ansible_shell_executable' from source: unknown 8781 1726773037.02137: variable 'ansible_host' from source: host vars for 'managed_node2' 8781 1726773037.02141: variable 'ansible_pipelining' from source: unknown 8781 1726773037.02144: variable 'ansible_timeout' from source: unknown 8781 1726773037.02149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8781 1726773037.02219: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8781 1726773037.02229: variable 'omit' from source: magic vars 8781 1726773037.02234: starting attempt loop 8781 1726773037.02238: running the handler 8781 1726773037.02244: _low_level_execute_command(): starting 8781 
1726773037.02249: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8781 1726773037.04505: stdout chunk (state=2): >>>/root <<< 8781 1726773037.04621: stderr chunk (state=3): >>><<< 8781 1726773037.04629: stdout chunk (state=3): >>><<< 8781 1726773037.04643: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8781 1726773037.04653: _low_level_execute_command(): starting 8781 1726773037.04658: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740 `" && echo ansible-tmp-1726773037.0464928-8781-198706033437740="` echo /root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740 `" ) && sleep 0' 8781 1726773037.07099: stdout chunk (state=2): >>>ansible-tmp-1726773037.0464928-8781-198706033437740=/root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740 <<< 8781 1726773037.07230: stderr chunk (state=3): >>><<< 8781 1726773037.07238: stdout chunk (state=3): >>><<< 8781 1726773037.07252: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773037.0464928-8781-198706033437740=/root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740 , stderr= 8781 1726773037.07284: variable 'ansible_module_compression' from source: unknown 8781 1726773037.07321: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8781 1726773037.07338: variable 'ansible_facts' from source: unknown 8781 1726773037.07396: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740/AnsiballZ_stat.py 8781 1726773037.07479: Sending initial data 8781 1726773037.07488: Sent initial data (151 bytes) 8781 1726773037.10007: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpvi5cwl5b /root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740/AnsiballZ_stat.py <<< 8781 1726773037.11198: stderr chunk (state=3): >>><<< 8781 1726773037.11206: stdout chunk (state=3): >>><<< 8781 1726773037.11224: done transferring module to remote 8781 1726773037.11233: _low_level_execute_command(): starting 8781 1726773037.11239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740/ /root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740/AnsiballZ_stat.py && sleep 0' 8781 1726773037.13596: stderr chunk (state=2): >>><<< 8781 1726773037.13606: stdout chunk (state=2): >>><<< 8781 1726773037.13622: _low_level_execute_command() done: rc=0, stdout=, stderr= 8781 1726773037.13626: _low_level_execute_command(): starting 8781 1726773037.13632: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740/AnsiballZ_stat.py && sleep 0' 8781 1726773037.29495: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773032.3730228, "mtime": 1726773030.476004, "ctime": 1726773030.476004, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, 
"block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8781 1726773037.30537: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8781 1726773037.30580: stderr chunk (state=3): >>><<< 8781 1726773037.30588: stdout chunk (state=3): >>><<< 8781 1726773037.30601: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773032.3730228, "mtime": 1726773030.476004, "ctime": 1726773030.476004, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 8781 1726773037.30631: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8781 1726773037.30638: _low_level_execute_command(): starting 8781 1726773037.30641: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773037.0464928-8781-198706033437740/ > /dev/null 2>&1 && sleep 0' 8781 1726773037.33145: stderr chunk (state=2): >>><<< 8781 1726773037.33160: stdout chunk (state=2): >>><<< 8781 1726773037.33178: _low_level_execute_command() done: rc=0, stdout=, stderr= 8781 1726773037.33186: handler run complete 8781 1726773037.33236: attempt loop complete, returning result 8781 1726773037.33257: variable 'item' from source: unknown 8781 1726773037.33340: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773032.3730228, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773030.476004, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, 
"mimetype": "inode/directory", "mode": "0755", "mtime": 1726773030.476004, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8781 1726773037.33395: dumping result to json 8781 1726773037.33406: done dumping result, returning 8781 1726773037.33415: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-885f-bbcf-0000000000b6] 8781 1726773037.33421: sending task result for task 0affffe7-6841-885f-bbcf-0000000000b6 8781 1726773037.33462: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000b6 8781 1726773037.33466: WORKER PROCESS EXITING 8240 1726773037.33681: no more pending results, returning what we have 8240 1726773037.33684: results queue empty 8240 1726773037.33687: checking for any_errors_fatal 8240 1726773037.33692: done checking for any_errors_fatal 8240 1726773037.33692: checking for max_fail_percentage 8240 1726773037.33693: done checking for max_fail_percentage 8240 1726773037.33694: checking to see if all hosts have failed and the running result is not ok 8240 1726773037.33695: done checking to see if all hosts have failed 8240 1726773037.33695: getting the remaining hosts for this loop 8240 1726773037.33696: done getting the remaining hosts for this loop 8240 1726773037.33699: getting the next task for host managed_node2 8240 1726773037.33704: done getting next task for host managed_node2 8240 1726773037.33706: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8240 1726773037.33710: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773037.33718: getting variables 8240 1726773037.33720: in VariableManager get_vars() 8240 1726773037.33750: Calling all_inventory to load vars for managed_node2 8240 1726773037.33752: Calling groups_inventory to load vars for managed_node2 8240 1726773037.33754: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773037.33761: Calling all_plugins_play to load vars for managed_node2 8240 1726773037.33763: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773037.33764: Calling groups_plugins_play to load vars for managed_node2 8240 1726773037.33958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773037.34180: done with get_vars() 8240 1726773037.34196: done getting variables 8240 1726773037.34255: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:10:37 -0400 (0:00:00.693) 0:00:15.986 **** 8240 1726773037.34287: entering _queue_task() for managed_node2/set_fact 8240 1726773037.34520: worker is 1 (out of 1 available) 8240 1726773037.34535: exiting _queue_task() for managed_node2/set_fact 8240 1726773037.34550: done queuing things up, now waiting for results queue to drain 8240 1726773037.34552: waiting for pending results... 8831 1726773037.34979: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8831 1726773037.35100: in run() - task 0affffe7-6841-885f-bbcf-0000000000b7 8831 1726773037.35118: variable 'ansible_search_path' from source: unknown 8831 1726773037.35122: variable 'ansible_search_path' from source: unknown 8831 1726773037.35152: calling self._execute() 8831 1726773037.35225: variable 'ansible_host' from source: host vars for 'managed_node2' 8831 1726773037.35234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8831 1726773037.35244: variable 'omit' from source: magic vars 8831 1726773037.35331: variable 'omit' from source: magic vars 8831 1726773037.35378: variable 'omit' from source: magic vars 8831 1726773037.35780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8831 1726773037.37869: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8831 1726773037.37922: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8831 1726773037.37952: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8831 1726773037.37979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8831 1726773037.38002: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8831 1726773037.38059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 8831 1726773037.38082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8831 1726773037.38106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8831 1726773037.38137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8831 1726773037.38149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8831 1726773037.38184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8831 1726773037.38204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8831 1726773037.38222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8831 1726773037.38250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8831 1726773037.38261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8831 1726773037.38314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8831 1726773037.38333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8831 1726773037.38353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8831 1726773037.38380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8831 1726773037.38393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8831 1726773037.38546: variable '__kernel_settings_find_profile_dirs' from source: set_fact 8831 1726773037.38618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8831 1726773037.38730: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8831 1726773037.38759: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8831 1726773037.38787: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8831 1726773037.38809: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8831 1726773037.38853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8831 1726773037.38873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8831 1726773037.38893: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8831 1726773037.38918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8831 1726773037.38975: variable 'omit' from source: magic vars 8831 1726773037.39002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8831 1726773037.39027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8831 1726773037.39045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8831 1726773037.39062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8831 1726773037.39076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8831 1726773037.39107: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8831 1726773037.39112: variable 'ansible_host' from source: host vars for 'managed_node2' 8831 1726773037.39116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8831 1726773037.39436: Set connection var ansible_pipelining to False 8831 1726773037.39446: Set connection var ansible_timeout to 10 8831 1726773037.39454: Set connection var ansible_module_compression to ZIP_DEFLATED 8831 1726773037.39457: Set connection var ansible_shell_type to sh 8831 1726773037.39462: Set connection var ansible_shell_executable to /bin/sh 8831 1726773037.39469: Set connection var ansible_connection to ssh 8831 1726773037.39494: variable 'ansible_shell_executable' from source: unknown 8831 1726773037.39499: variable 'ansible_connection' from source: unknown 8831 1726773037.39503: variable 'ansible_module_compression' from source: unknown 8831 1726773037.39505: variable 'ansible_shell_type' from source: unknown 8831 1726773037.39508: variable 'ansible_shell_executable' from source: unknown 8831 1726773037.39511: variable 'ansible_host' from source: host vars for 'managed_node2' 8831 1726773037.39514: variable 'ansible_pipelining' from source: unknown 8831 1726773037.39516: variable 'ansible_timeout' from source: unknown 8831 1726773037.39520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8831 1726773037.39605: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8831 1726773037.39617: variable 'omit' from source: magic vars 8831 1726773037.39623: starting attempt loop 8831 1726773037.39626: running the handler 8831 1726773037.39638: handler run complete 8831 1726773037.39648: attempt loop complete, returning result 8831 1726773037.39651: _execute() done 8831 1726773037.39654: dumping result to json 8831 1726773037.39657: done dumping result, returning 8831 1726773037.39663: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-885f-bbcf-0000000000b7] 8831 1726773037.39672: sending task result for task 0affffe7-6841-885f-bbcf-0000000000b7 8831 1726773037.39701: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000b7 8831 1726773037.39704: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8240 1726773037.39853: no more pending results, returning what we have 8240 1726773037.39856: results queue empty 8240 1726773037.39856: checking for any_errors_fatal 8240 1726773037.39863: done checking for any_errors_fatal 8240 1726773037.39864: checking for max_fail_percentage 8240 1726773037.39865: done checking for max_fail_percentage 8240 1726773037.39866: checking to see if all hosts have failed and the running result is not ok 8240 1726773037.39866: done checking to see if all hosts have failed 8240 1726773037.39867: getting the remaining hosts for this loop 8240 1726773037.39868: done getting the remaining hosts for this loop 8240 1726773037.39871: getting the next task for host managed_node2 8240 1726773037.39876: done getting next task for host managed_node2 8240 1726773037.39879: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8240 1726773037.39881: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773037.39892: getting variables 8240 1726773037.39893: in VariableManager get_vars() 8240 1726773037.39923: Calling all_inventory to load vars for managed_node2 8240 1726773037.39926: Calling groups_inventory to load vars for managed_node2 8240 1726773037.39928: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773037.39937: Calling all_plugins_play to load vars for managed_node2 8240 1726773037.39939: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773037.39942: Calling groups_plugins_play to load vars for managed_node2 8240 1726773037.40109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773037.40306: done with get_vars() 8240 1726773037.40318: done getting variables 8240 1726773037.40376: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:10:37 -0400 (0:00:00.061) 0:00:16.047 **** 8240 1726773037.40410: entering _queue_task() for managed_node2/service 8240 1726773037.40626: worker is 1 (out of 1 available) 8240 1726773037.40639: exiting _queue_task() for managed_node2/service 8240 1726773037.40651: done queuing things up, now waiting for results queue to drain 8240 1726773037.40653: waiting for pending results... 8838 1726773037.40913: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8838 1726773037.41046: in run() - task 0affffe7-6841-885f-bbcf-0000000000b8 8838 1726773037.41061: variable 'ansible_search_path' from source: unknown 8838 1726773037.41064: variable 'ansible_search_path' from source: unknown 8838 1726773037.41099: variable '__kernel_settings_services' from source: include_vars 8838 1726773037.41401: variable '__kernel_settings_services' from source: include_vars 8838 1726773037.41456: variable 'omit' from source: magic vars 8838 1726773037.41527: variable 'ansible_host' from source: host vars for 'managed_node2' 8838 1726773037.41537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8838 1726773037.41543: variable 'omit' from source: magic vars 8838 1726773037.41593: variable 'omit' from source: magic vars 8838 1726773037.41621: variable 'omit' from source: magic vars 8838 1726773037.41658: variable 'item' from source: unknown 8838 1726773037.41720: variable 'item' from source: unknown 8838 1726773037.41741: variable 'omit' from source: magic vars 8838 1726773037.41775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8838 1726773037.41803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8838 1726773037.41822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8838 1726773037.41836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8838 1726773037.41846: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8838 1726773037.41871: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8838 1726773037.41875: variable 'ansible_host' from source: host vars for 'managed_node2' 8838 1726773037.41878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8838 1726773037.41946: Set connection var ansible_pipelining to False 8838 1726773037.41954: Set connection var ansible_timeout to 10 8838 1726773037.41962: Set connection var ansible_module_compression to ZIP_DEFLATED 8838 1726773037.41965: Set connection var ansible_shell_type to sh 8838 1726773037.41969: Set connection var ansible_shell_executable to /bin/sh 8838 1726773037.41972: Set connection var ansible_connection to ssh 8838 1726773037.41984: variable 'ansible_shell_executable' from source: unknown 8838 1726773037.41988: variable 'ansible_connection' from source: unknown 8838 1726773037.41990: variable 'ansible_module_compression' from source: unknown 8838 1726773037.41992: variable 'ansible_shell_type' from source: unknown 8838 1726773037.41993: variable 'ansible_shell_executable' from source: unknown 8838 1726773037.41995: variable 'ansible_host' from source: host vars for 'managed_node2' 8838 1726773037.41997: variable 'ansible_pipelining' from source: unknown 8838 1726773037.41999: variable 'ansible_timeout' from source: unknown 8838 1726773037.42001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8838 1726773037.42091: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8838 1726773037.42102: variable 'omit' from source: magic vars 8838 1726773037.42109: starting attempt loop 8838 1726773037.42112: running the handler 8838 1726773037.42171: variable 'ansible_facts' from source: unknown 8838 1726773037.42248: _low_level_execute_command(): starting 8838 1726773037.42255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8838 1726773037.44654: stdout chunk (state=2): >>>/root <<< 8838 1726773037.45069: stderr chunk (state=3): >>><<< 8838 1726773037.45078: stdout chunk (state=3): >>><<< 8838 1726773037.45100: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8838 1726773037.45114: _low_level_execute_command(): starting 8838 1726773037.45121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090 `" && echo ansible-tmp-1726773037.451086-8838-95848653438090="` echo /root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090 `" ) && sleep 0' 8838 1726773037.47651: stdout chunk (state=2): >>>ansible-tmp-1726773037.451086-8838-95848653438090=/root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090 <<< 8838 1726773037.47789: stderr chunk (state=3): >>><<< 8838 1726773037.47796: stdout chunk (state=3): >>><<< 8838 1726773037.47812: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773037.451086-8838-95848653438090=/root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090 , stderr= 8838 1726773037.47838: variable 'ansible_module_compression' from source: unknown 8838 1726773037.47889: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8838 1726773037.47943: variable 'ansible_facts' from source: unknown 8838 1726773037.48107: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090/AnsiballZ_systemd.py 8838 1726773037.48215: Sending initial data 8838 1726773037.48223: Sent initial data (152 bytes) 8838 1726773037.50783: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp0qc6j52i /root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090/AnsiballZ_systemd.py <<< 8838 1726773037.52794: stderr chunk (state=3): >>><<< 8838 1726773037.52805: stdout chunk (state=3): >>><<< 8838 1726773037.52828: done transferring module to remote 8838 1726773037.52840: _low_level_execute_command(): starting 8838 1726773037.52845: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090/ /root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090/AnsiballZ_systemd.py && sleep 0' 8838 1726773037.55279: stderr chunk (state=2): >>><<< 8838 1726773037.55290: stdout chunk (state=2): >>><<< 8838 1726773037.55305: _low_level_execute_command() done: rc=0, stdout=, stderr= 8838 1726773037.55309: _low_level_execute_command(): starting 8838 1726773037.55315: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090/AnsiballZ_systemd.py && sleep 0' 8838 1726773037.84475: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:32 EDT", "WatchdogTimestampMonotonic": "428428211", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8454", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ExecMainStartTimestampMonotonic": "428183847", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8454", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:32 EDT] ; stop_time=[n/a] ; pid=8454 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15044608", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", 
"MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryH<<< 8838 1726773037.84497: stdout chunk (state=3): >>>igh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.service dbus.socket system.slice sysinit.target", "WantedBy": 
"multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "Before": "multi-user.target shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:32 EDT", "StateChangeTimestampMonotonic": "428428215", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveExitTimestampMonotonic": "428183898", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveEnterTimestampMonotonic": "428428215", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveExitTimestampMonotonic": "428067727", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveEnterTimestampMonotonic": "428180887", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ConditionTimestampMonotonic": "428181985", "AssertTimestamp": "Thu 2024-09-19 15:10:32 EDT", "AssertTimestampMonotonic": "428181987", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "e777597018b64b11af33cf8cce2131bb", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8838 1726773037.86168: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 8838 1726773037.86215: stderr chunk (state=3): >>><<< 8838 1726773037.86222: stdout chunk (state=3): >>><<< 8838 1726773037.86242: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:32 EDT", "WatchdogTimestampMonotonic": "428428211", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8454", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ExecMainStartTimestampMonotonic": "428183847", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8454", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:32 EDT] ; stop_time=[n/a] ; pid=8454 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15044608", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.service dbus.socket system.slice sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "Before": "multi-user.target shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:32 EDT", "StateChangeTimestampMonotonic": "428428215", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveExitTimestampMonotonic": "428183898", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveEnterTimestampMonotonic": "428428215", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveExitTimestampMonotonic": "428067727", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveEnterTimestampMonotonic": "428180887", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", 
"JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ConditionTimestampMonotonic": "428181985", "AssertTimestamp": "Thu 2024-09-19 15:10:32 EDT", "AssertTimestampMonotonic": "428181987", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "e777597018b64b11af33cf8cce2131bb", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.9.64 closed. 8838 1726773037.86358: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8838 1726773037.86378: _low_level_execute_command(): starting 8838 1726773037.86386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773037.451086-8838-95848653438090/ > /dev/null 2>&1 && sleep 0' 8838 1726773037.88843: stderr chunk (state=2): >>><<< 8838 1726773037.88853: stdout chunk (state=2): >>><<< 8838 1726773037.88869: _low_level_execute_command() done: rc=0, stdout=, stderr= 8838 1726773037.88877: handler run complete 8838 1726773037.88914: attempt loop complete, returning result 8838 1726773037.88930: variable 'item' from source: unknown 8838 1726773037.88992: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveEnterTimestampMonotonic": "428428215", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveExitTimestampMonotonic": "428067727", "ActiveState": "active", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:32 EDT", "AssertTimestampMonotonic": "428181987", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", 
"CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ConditionTimestampMonotonic": "428181985", "ConfigurationDirectoryMode": "0755", "Conflicts": "auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8454", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ExecMainStartTimestampMonotonic": "428183847", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:32 EDT] ; stop_time=[n/a] ; pid=8454 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveEnterTimestampMonotonic": "428180887", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveExitTimestampMonotonic": "428183898", "InvocationID": "e777597018b64b11af33cf8cce2131bb", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": 
"loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "8454", "MemoryAccounting": "yes", "MemoryCurrent": "15044608", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.service dbus.socket system.slice sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:32 EDT", "StateChangeTimestampMonotonic": "428428215", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:32 EDT", "WatchdogTimestampMonotonic": "428428211", "WatchdogUSec": "0" } } 8838 1726773037.89087: dumping result to json 8838 1726773037.89107: done dumping result, returning 8838 1726773037.89115: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-885f-bbcf-0000000000b8] 8838 1726773037.89121: sending task result for task 0affffe7-6841-885f-bbcf-0000000000b8 8838 1726773037.89224: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000b8 8838 1726773037.89229: WORKER PROCESS EXITING 8240 1726773037.89572: no more pending results, returning what we have 8240 1726773037.89574: results queue empty 8240 1726773037.89575: checking for any_errors_fatal 8240 1726773037.89578: done checking for any_errors_fatal 8240 
1726773037.89578: checking for max_fail_percentage 8240 1726773037.89579: done checking for max_fail_percentage 8240 1726773037.89579: checking to see if all hosts have failed and the running result is not ok 8240 1726773037.89580: done checking to see if all hosts have failed 8240 1726773037.89580: getting the remaining hosts for this loop 8240 1726773037.89581: done getting the remaining hosts for this loop 8240 1726773037.89583: getting the next task for host managed_node2 8240 1726773037.89589: done getting next task for host managed_node2 8240 1726773037.89591: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8240 1726773037.89592: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773037.89599: getting variables 8240 1726773037.89600: in VariableManager get_vars() 8240 1726773037.89619: Calling all_inventory to load vars for managed_node2 8240 1726773037.89620: Calling groups_inventory to load vars for managed_node2 8240 1726773037.89622: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773037.89628: Calling all_plugins_play to load vars for managed_node2 8240 1726773037.89630: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773037.89631: Calling groups_plugins_play to load vars for managed_node2 8240 1726773037.89727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773037.89841: done with get_vars() 8240 1726773037.89848: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:10:37 -0400 (0:00:00.495) 0:00:16.542 **** 8240 1726773037.89913: entering _queue_task() for managed_node2/file 8240 1726773037.90068: worker is 1 (out of 1 available) 8240 1726773037.90082: exiting _queue_task() for managed_node2/file 8240 1726773037.90094: done queuing things up, now waiting for results queue to drain 8240 1726773037.90096: waiting for pending results... 
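[Annotation] The service loop that just returned ok above (item=tuned, and module_args showing name=tuned, state=started, enabled=true, dispatched through ansible.legacy.systemd) corresponds to a task of roughly the following shape. The loop variable __kernel_settings_services is visible in the variable lookups earlier in the log, but the exact YAML at tasks/main.yml:67 is not reproduced here, so treat this as an illustrative sketch rather than the role's literal source:

    - name: Ensure required services are enabled and started
      service:
        name: "{{ item }}"
        state: started
        enabled: true
      loop: "{{ __kernel_settings_services }}"

Because tuned was already enabled and running on managed_node2, the module reports changed=false, which is why the task result is "ok" rather than "changed".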
8860 1726773037.90217: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8860 1726773037.90325: in run() - task 0affffe7-6841-885f-bbcf-0000000000b9 8860 1726773037.90341: variable 'ansible_search_path' from source: unknown 8860 1726773037.90345: variable 'ansible_search_path' from source: unknown 8860 1726773037.90374: calling self._execute() 8860 1726773037.90435: variable 'ansible_host' from source: host vars for 'managed_node2' 8860 1726773037.90443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8860 1726773037.90449: variable 'omit' from source: magic vars 8860 1726773037.90518: variable 'omit' from source: magic vars 8860 1726773037.90553: variable 'omit' from source: magic vars 8860 1726773037.90573: variable '__kernel_settings_profile_dir' from source: role '' all vars 8860 1726773037.90788: variable '__kernel_settings_profile_dir' from source: role '' all vars 8860 1726773037.90855: variable '__kernel_settings_profile_parent' from source: set_fact 8860 1726773037.90863: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8860 1726773037.90918: variable 'omit' from source: magic vars 8860 1726773037.90945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8860 1726773037.90973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8860 1726773037.90993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8860 1726773037.91006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8860 1726773037.91014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8860 1726773037.91035: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8860 1726773037.91038: variable 'ansible_host' from source: host vars for 'managed_node2' 8860 1726773037.91041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8860 1726773037.91131: Set connection var ansible_pipelining to False 8860 1726773037.91139: Set connection var ansible_timeout to 10 8860 1726773037.91147: Set connection var ansible_module_compression to ZIP_DEFLATED 8860 1726773037.91150: Set connection var ansible_shell_type to sh 8860 1726773037.91155: Set connection var ansible_shell_executable to /bin/sh 8860 1726773037.91160: Set connection var ansible_connection to ssh 8860 1726773037.91178: variable 'ansible_shell_executable' from source: unknown 8860 1726773037.91182: variable 'ansible_connection' from source: unknown 8860 1726773037.91186: variable 'ansible_module_compression' from source: unknown 8860 1726773037.91190: variable 'ansible_shell_type' from source: unknown 8860 1726773037.91192: variable 'ansible_shell_executable' from source: unknown 8860 1726773037.91194: variable 'ansible_host' from source: host vars for 'managed_node2' 8860 1726773037.91197: variable 'ansible_pipelining' from source: unknown 8860 1726773037.91200: variable 'ansible_timeout' from source: unknown 8860 1726773037.91202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8860 1726773037.91334: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8860 1726773037.91343: variable 'omit' from source: magic vars 8860 1726773037.91347: starting attempt loop 8860 1726773037.91350: running the handler 8860 1726773037.91359: _low_level_execute_command(): starting 8860 1726773037.91368: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8860 1726773037.93691: stdout chunk (state=2): >>>/root <<< 8860 1726773037.93815: stderr chunk (state=3): >>><<< 8860 1726773037.93824: stdout chunk (state=3): >>><<< 8860 1726773037.93846: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8860 1726773037.93860: _low_level_execute_command(): starting 8860 1726773037.93869: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995 `" && echo ansible-tmp-1726773037.9385397-8860-87302332573995="` echo /root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995 `" ) && sleep 0' 8860 1726773037.96355: stdout chunk (state=2): >>>ansible-tmp-1726773037.9385397-8860-87302332573995=/root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995 <<< 8860 1726773037.96490: stderr chunk (state=3): >>><<< 8860 1726773037.96498: stdout chunk (state=3): >>><<< 8860 1726773037.96514: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773037.9385397-8860-87302332573995=/root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995 , stderr= 8860 1726773037.96551: variable 'ansible_module_compression' from source: unknown 8860 1726773037.96598: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 8860 1726773037.96629: variable 'ansible_facts' from source: unknown 8860 1726773037.96701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995/AnsiballZ_file.py 8860 1726773037.96808: Sending initial data 8860 1726773037.96815: Sent initial data (150 bytes) 8860 1726773037.99364: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpb683rc0o /root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995/AnsiballZ_file.py <<< 8860 1726773038.00489: stderr chunk (state=3): >>><<< 8860 1726773038.00498: stdout chunk (state=3): >>><<< 8860 1726773038.00518: done transferring module to remote 8860 1726773038.00529: _low_level_execute_command(): starting 8860 1726773038.00534: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995/ /root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995/AnsiballZ_file.py && sleep 0' 8860 1726773038.02934: stderr chunk (state=2): >>><<< 8860 1726773038.02945: stdout chunk (state=2): >>><<< 8860 1726773038.02961: _low_level_execute_command() done: rc=0, stdout=, stderr= 8860 1726773038.02965: _low_level_execute_command(): starting 8860 1726773038.02971: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995/AnsiballZ_file.py && sleep 0' 8860 1726773038.18773: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, 
"owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8860 1726773038.19863: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8860 1726773038.19915: stderr chunk (state=3): >>><<< 8860 1726773038.19923: stdout chunk (state=3): >>><<< 8860 1726773038.19940: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 8860 1726773038.19974: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8860 1726773038.19984: _low_level_execute_command(): starting 8860 1726773038.19990: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773037.9385397-8860-87302332573995/ > /dev/null 2>&1 && sleep 0' 8860 1726773038.22451: stderr chunk (state=2): >>><<< 8860 1726773038.22462: stdout chunk (state=2): >>><<< 8860 1726773038.22481: _low_level_execute_command() done: rc=0, stdout=, stderr= 8860 1726773038.22490: handler run complete 8860 1726773038.22509: attempt loop complete, returning result 8860 1726773038.22513: _execute() done 8860 1726773038.22516: dumping result to json 8860 1726773038.22524: done dumping result, returning 8860 1726773038.22532: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-885f-bbcf-0000000000b9] 8860 1726773038.22541: sending task result for task 0affffe7-6841-885f-bbcf-0000000000b9 8860 1726773038.22576: done sending task result for task 
0affffe7-6841-885f-bbcf-0000000000b9 8860 1726773038.22580: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8240 1726773038.22729: no more pending results, returning what we have 8240 1726773038.22732: results queue empty 8240 1726773038.22733: checking for any_errors_fatal 8240 1726773038.22747: done checking for any_errors_fatal 8240 1726773038.22747: checking for max_fail_percentage 8240 1726773038.22749: done checking for max_fail_percentage 8240 1726773038.22749: checking to see if all hosts have failed and the running result is not ok 8240 1726773038.22750: done checking to see if all hosts have failed 8240 1726773038.22751: getting the remaining hosts for this loop 8240 1726773038.22752: done getting the remaining hosts for this loop 8240 1726773038.22755: getting the next task for host managed_node2 8240 1726773038.22760: done getting next task for host managed_node2 8240 1726773038.22763: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8240 1726773038.22765: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773038.22775: getting variables 8240 1726773038.22776: in VariableManager get_vars() 8240 1726773038.22808: Calling all_inventory to load vars for managed_node2 8240 1726773038.22810: Calling groups_inventory to load vars for managed_node2 8240 1726773038.22812: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773038.22820: Calling all_plugins_play to load vars for managed_node2 8240 1726773038.22823: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773038.22825: Calling groups_plugins_play to load vars for managed_node2 8240 1726773038.22930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773038.23048: done with get_vars() 8240 1726773038.23059: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:10:38 -0400 (0:00:00.332) 0:00:16.875 **** 8240 1726773038.23125: entering _queue_task() for managed_node2/slurp 8240 1726773038.23294: worker is 1 (out of 1 available) 8240 1726773038.23308: exiting _queue_task() for managed_node2/slurp 8240 1726773038.23320: done queuing things up, now waiting for results queue to drain 8240 1726773038.23322: waiting for pending results... 
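The result just above is the first complete kernel_settings task visible in this part of the trace, and every module-based task here follows the same remote-execution pattern: /bin/sh -c 'echo ~' to resolve the remote home directory, creation of a per-task temp directory under /root/.ansible/tmp, an sftp put of the AnsiballZ_<module>.py payload, chmod u+x on it, execution with /usr/libexec/platform-python, and finally rm -f -r of the temp directory. The file module_args echoed in the stdout chunk imply a task roughly equivalent to the sketch below; the role's real tasks/main.yml is not reproduced in this log, so anything not shown in the module_args is an assumption.

# Sketch only -- reconstructed from the module_args above, not copied from the role.
- name: Ensure kernel settings profile directory exists
  file:
    path: /etc/tuned/kernel_settings
    state: directory
    mode: "0755"

The ok (changed=false) status is consistent with /etc/tuned/kernel_settings already existing with the requested mode and ownership.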
8875 1726773038.23442: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8875 1726773038.23552: in run() - task 0affffe7-6841-885f-bbcf-0000000000ba 8875 1726773038.23572: variable 'ansible_search_path' from source: unknown 8875 1726773038.23577: variable 'ansible_search_path' from source: unknown 8875 1726773038.23607: calling self._execute() 8875 1726773038.23672: variable 'ansible_host' from source: host vars for 'managed_node2' 8875 1726773038.23681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8875 1726773038.23691: variable 'omit' from source: magic vars 8875 1726773038.23768: variable 'omit' from source: magic vars 8875 1726773038.23805: variable 'omit' from source: magic vars 8875 1726773038.23827: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8875 1726773038.24047: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8875 1726773038.24110: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8875 1726773038.24141: variable 'omit' from source: magic vars 8875 1726773038.24178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8875 1726773038.24208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8875 1726773038.24228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8875 1726773038.24242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8875 1726773038.24254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8875 1726773038.24281: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8875 1726773038.24288: variable 'ansible_host' from source: host vars for 'managed_node2' 8875 1726773038.24292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8875 1726773038.24358: Set connection var ansible_pipelining to False 8875 1726773038.24364: Set connection var ansible_timeout to 10 8875 1726773038.24373: Set connection var ansible_module_compression to ZIP_DEFLATED 8875 1726773038.24376: Set connection var ansible_shell_type to sh 8875 1726773038.24379: Set connection var ansible_shell_executable to /bin/sh 8875 1726773038.24382: Set connection var ansible_connection to ssh 8875 1726773038.24408: variable 'ansible_shell_executable' from source: unknown 8875 1726773038.24472: variable 'ansible_connection' from source: unknown 8875 1726773038.24478: variable 'ansible_module_compression' from source: unknown 8875 1726773038.24481: variable 'ansible_shell_type' from source: unknown 8875 1726773038.24484: variable 'ansible_shell_executable' from source: unknown 8875 1726773038.24488: variable 'ansible_host' from source: host vars for 'managed_node2' 8875 1726773038.24492: variable 'ansible_pipelining' from source: unknown 8875 1726773038.24495: variable 'ansible_timeout' from source: unknown 8875 1726773038.24499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8875 1726773038.24643: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
8875 1726773038.24655: variable 'omit' from source: magic vars 8875 1726773038.24661: starting attempt loop 8875 1726773038.24664: running the handler 8875 1726773038.24680: _low_level_execute_command(): starting 8875 1726773038.24689: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8875 1726773038.27047: stdout chunk (state=2): >>>/root <<< 8875 1726773038.27171: stderr chunk (state=3): >>><<< 8875 1726773038.27179: stdout chunk (state=3): >>><<< 8875 1726773038.27201: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8875 1726773038.27216: _low_level_execute_command(): starting 8875 1726773038.27223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371 `" && echo ansible-tmp-1726773038.2721014-8875-144238566710371="` echo /root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371 `" ) && sleep 0' 8875 1726773038.29732: stdout chunk (state=2): >>>ansible-tmp-1726773038.2721014-8875-144238566710371=/root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371 <<< 8875 1726773038.29867: stderr chunk (state=3): >>><<< 8875 1726773038.29874: stdout chunk (state=3): >>><<< 8875 1726773038.29893: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773038.2721014-8875-144238566710371=/root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371 , stderr= 8875 1726773038.29931: variable 'ansible_module_compression' from source: unknown 8875 1726773038.29974: ANSIBALLZ: Using lock for slurp 8875 1726773038.29980: ANSIBALLZ: Acquiring lock 8875 1726773038.29983: ANSIBALLZ: Lock acquired: 139787572729072 8875 1726773038.29989: ANSIBALLZ: Creating module 8875 1726773038.43929: ANSIBALLZ: Writing module into payload 8875 1726773038.44010: ANSIBALLZ: Writing module 8875 1726773038.44035: ANSIBALLZ: Renaming module 8875 1726773038.44043: ANSIBALLZ: Done creating module 8875 1726773038.44062: variable 'ansible_facts' from source: unknown 8875 1726773038.44147: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371/AnsiballZ_slurp.py 8875 1726773038.44635: Sending initial data 8875 1726773038.44642: Sent initial data (152 bytes) 8875 1726773038.47196: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmphipdo7fv /root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371/AnsiballZ_slurp.py <<< 8875 1726773038.48712: stderr chunk (state=3): >>><<< 8875 1726773038.48722: stdout chunk (state=3): >>><<< 8875 1726773038.48748: done transferring module to remote 8875 1726773038.48762: _low_level_execute_command(): starting 8875 1726773038.48768: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371/ /root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371/AnsiballZ_slurp.py && sleep 0' 8875 1726773038.51809: stderr chunk (state=2): >>><<< 8875 1726773038.51818: stdout chunk (state=2): >>><<< 8875 1726773038.51831: _low_level_execute_command() done: rc=0, stdout=, stderr= 8875 1726773038.51834: _low_level_execute_command(): starting 8875 1726773038.51838: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371/AnsiballZ_slurp.py && sleep 0' 8875 1726773038.66748: stdout chunk (state=2): >>> {"content": 
"dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 8875 1726773038.67780: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8875 1726773038.67824: stderr chunk (state=3): >>><<< 8875 1726773038.67830: stdout chunk (state=3): >>><<< 8875 1726773038.67843: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.9.64 closed. 8875 1726773038.67865: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8875 1726773038.67874: _low_level_execute_command(): starting 8875 1726773038.67881: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773038.2721014-8875-144238566710371/ > /dev/null 2>&1 && sleep 0' 8875 1726773038.70392: stderr chunk (state=2): >>><<< 8875 1726773038.70404: stdout chunk (state=2): >>><<< 8875 1726773038.70425: _low_level_execute_command() done: rc=0, stdout=, stderr= 8875 1726773038.70434: handler run complete 8875 1726773038.70453: attempt loop complete, returning result 8875 1726773038.70457: _execute() done 8875 1726773038.70461: dumping result to json 8875 1726773038.70467: done dumping result, returning 8875 1726773038.70474: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-885f-bbcf-0000000000ba] 8875 1726773038.70481: sending task result for task 0affffe7-6841-885f-bbcf-0000000000ba 8875 1726773038.70519: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000ba 8875 1726773038.70523: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdAo=", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8240 1726773038.70746: no more pending results, returning what we have 8240 1726773038.70749: results queue empty 8240 1726773038.70750: checking for any_errors_fatal 8240 1726773038.70759: done checking for any_errors_fatal 8240 1726773038.70759: checking for max_fail_percentage 8240 1726773038.70760: done checking for max_fail_percentage 8240 1726773038.70761: checking to see if all hosts have failed and the running result is not ok 8240 1726773038.70762: done checking to see if all hosts have failed 8240 1726773038.70762: getting the remaining hosts for this loop 8240 1726773038.70763: done getting the remaining hosts for this loop 8240 1726773038.70766: getting the next task for host managed_node2 8240 1726773038.70772: done getting next task for host managed_node2 8240 1726773038.70775: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8240 1726773038.70777: ^ 
state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773038.70787: getting variables 8240 1726773038.70788: in VariableManager get_vars() 8240 1726773038.70815: Calling all_inventory to load vars for managed_node2 8240 1726773038.70817: Calling groups_inventory to load vars for managed_node2 8240 1726773038.70818: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773038.70825: Calling all_plugins_play to load vars for managed_node2 8240 1726773038.70827: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773038.70829: Calling groups_plugins_play to load vars for managed_node2 8240 1726773038.70934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773038.71094: done with get_vars() 8240 1726773038.71107: done getting variables 8240 1726773038.71150: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:10:38 -0400 (0:00:00.480) 0:00:17.355 **** 8240 1726773038.71187: entering _queue_task() for managed_node2/set_fact 8240 1726773038.71368: worker is 1 (out of 1 available) 8240 1726773038.71381: exiting _queue_task() for managed_node2/set_fact 8240 1726773038.71395: done queuing things up, now waiting for results queue to drain 8240 1726773038.71397: waiting for pending results... 
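The Get active_profile result printed above came from the slurp module reading /etc/tuned/active_profile: slurp returns the file content base64-encoded, and "dmlydHVhbC1ndWVzdAo=" decodes to "virtual-guest" plus a trailing newline. A minimal sketch of an equivalent task pair follows; only the slurp path and the later variable __kernel_settings_tuned_current_profile are confirmed by the trace, so the register name and the separate decode step are assumptions.

# Sketch only -- slurp call reconstructed from the module_args in the trace;
# the register name below is hypothetical.
- name: Get active_profile
  slurp:
    path: /etc/tuned/active_profile
  register: __kernel_settings_profile_contents

# Assumed decode step producing the variable the trace later reports from set_fact.
- name: Decode the current tuned profile
  set_fact:
    __kernel_settings_tuned_current_profile: "{{ __kernel_settings_profile_contents.content | b64decode | trim }}"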
8906 1726773038.71603: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8906 1726773038.71740: in run() - task 0affffe7-6841-885f-bbcf-0000000000bb 8906 1726773038.71761: variable 'ansible_search_path' from source: unknown 8906 1726773038.71765: variable 'ansible_search_path' from source: unknown 8906 1726773038.71797: calling self._execute() 8906 1726773038.71872: variable 'ansible_host' from source: host vars for 'managed_node2' 8906 1726773038.71882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8906 1726773038.71891: variable 'omit' from source: magic vars 8906 1726773038.71989: variable 'omit' from source: magic vars 8906 1726773038.72035: variable 'omit' from source: magic vars 8906 1726773038.72373: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8906 1726773038.72390: variable '__cur_profile' from source: task vars 8906 1726773038.72529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8906 1726773038.74045: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8906 1726773038.74108: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8906 1726773038.74137: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8906 1726773038.74164: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8906 1726773038.74188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8906 1726773038.74269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8906 1726773038.74298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8906 1726773038.74321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8906 1726773038.74365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8906 1726773038.74380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8906 1726773038.74471: variable '__kernel_settings_tuned_current_profile' from source: set_fact 8906 1726773038.74513: variable 'omit' from source: magic vars 8906 1726773038.74534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8906 1726773038.74554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8906 1726773038.74567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8906 1726773038.74578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8906 
1726773038.74587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8906 1726773038.74608: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8906 1726773038.74612: variable 'ansible_host' from source: host vars for 'managed_node2' 8906 1726773038.74614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8906 1726773038.74679: Set connection var ansible_pipelining to False 8906 1726773038.74707: Set connection var ansible_timeout to 10 8906 1726773038.74717: Set connection var ansible_module_compression to ZIP_DEFLATED 8906 1726773038.74721: Set connection var ansible_shell_type to sh 8906 1726773038.74727: Set connection var ansible_shell_executable to /bin/sh 8906 1726773038.74732: Set connection var ansible_connection to ssh 8906 1726773038.74756: variable 'ansible_shell_executable' from source: unknown 8906 1726773038.74761: variable 'ansible_connection' from source: unknown 8906 1726773038.74765: variable 'ansible_module_compression' from source: unknown 8906 1726773038.74768: variable 'ansible_shell_type' from source: unknown 8906 1726773038.74771: variable 'ansible_shell_executable' from source: unknown 8906 1726773038.74774: variable 'ansible_host' from source: host vars for 'managed_node2' 8906 1726773038.74778: variable 'ansible_pipelining' from source: unknown 8906 1726773038.74781: variable 'ansible_timeout' from source: unknown 8906 1726773038.74786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8906 1726773038.74870: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8906 1726773038.74882: variable 'omit' from source: magic vars 8906 1726773038.74890: starting attempt loop 8906 1726773038.74893: running the handler 8906 1726773038.74903: handler run complete 8906 1726773038.74912: attempt loop complete, returning result 8906 1726773038.74915: _execute() done 8906 1726773038.74917: dumping result to json 8906 1726773038.74920: done dumping result, returning 8906 1726773038.74926: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-885f-bbcf-0000000000bb] 8906 1726773038.74931: sending task result for task 0affffe7-6841-885f-bbcf-0000000000bb 8906 1726773038.74954: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000bb 8906 1726773038.74957: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8240 1726773038.75342: no more pending results, returning what we have 8240 1726773038.75344: results queue empty 8240 1726773038.75345: checking for any_errors_fatal 8240 1726773038.75350: done checking for any_errors_fatal 8240 1726773038.75350: checking for max_fail_percentage 8240 1726773038.75351: done checking for max_fail_percentage 8240 1726773038.75352: checking to see if all hosts have failed and the running result is not ok 8240 1726773038.75352: done checking to see if all hosts have failed 8240 1726773038.75352: getting the remaining hosts for this loop 8240 1726773038.75353: done getting the remaining hosts for this loop 8240 1726773038.75356: getting the next task 
for host managed_node2 8240 1726773038.75360: done getting next task for host managed_node2 8240 1726773038.75362: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8240 1726773038.75364: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773038.75375: getting variables 8240 1726773038.75376: in VariableManager get_vars() 8240 1726773038.75401: Calling all_inventory to load vars for managed_node2 8240 1726773038.75403: Calling groups_inventory to load vars for managed_node2 8240 1726773038.75404: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773038.75411: Calling all_plugins_play to load vars for managed_node2 8240 1726773038.75412: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773038.75414: Calling groups_plugins_play to load vars for managed_node2 8240 1726773038.75517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773038.75642: done with get_vars() 8240 1726773038.75651: done getting variables 8240 1726773038.75694: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:10:38 -0400 (0:00:00.045) 0:00:17.401 **** 8240 1726773038.75716: entering _queue_task() for managed_node2/copy 8240 1726773038.75875: worker is 1 (out of 1 available) 8240 1726773038.75893: exiting _queue_task() for managed_node2/copy 8240 1726773038.75904: done queuing things up, now waiting for results queue to drain 8240 1726773038.75906: waiting for pending results... 
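The Set active_profile result above is a pure fact assignment: the current tuned profile ("virtual-guest") is extended with "kernel_settings". The role's actual Jinja2 expression is not visible in the trace, so the idempotency guard in the sketch below is an assumption; only the variable names and the resulting value are confirmed.

# Sketch only -- produces the fact shown in the result above; the guard is an
# assumption, not the role's verbatim expression.
- name: Set active_profile
  set_fact:
    __kernel_settings_active_profile: >-
      {{ __kernel_settings_tuned_current_profile
         if 'kernel_settings' in __kernel_settings_tuned_current_profile
         else __kernel_settings_tuned_current_profile ~ ' kernel_settings' }}

Because set_fact runs entirely on the controller, this is the one task in this stretch of the trace with no _low_level_execute_command calls and no module transfer.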
8909 1726773038.76024: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8909 1726773038.76134: in run() - task 0affffe7-6841-885f-bbcf-0000000000bc 8909 1726773038.76150: variable 'ansible_search_path' from source: unknown 8909 1726773038.76154: variable 'ansible_search_path' from source: unknown 8909 1726773038.76183: calling self._execute() 8909 1726773038.76247: variable 'ansible_host' from source: host vars for 'managed_node2' 8909 1726773038.76256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8909 1726773038.76264: variable 'omit' from source: magic vars 8909 1726773038.76338: variable 'omit' from source: magic vars 8909 1726773038.76377: variable 'omit' from source: magic vars 8909 1726773038.76400: variable '__kernel_settings_active_profile' from source: set_fact 8909 1726773038.76617: variable '__kernel_settings_active_profile' from source: set_fact 8909 1726773038.76639: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8909 1726773038.76696: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8909 1726773038.76749: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8909 1726773038.76826: variable 'omit' from source: magic vars 8909 1726773038.76860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8909 1726773038.76892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8909 1726773038.76912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8909 1726773038.76927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8909 1726773038.76937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8909 1726773038.76960: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8909 1726773038.76968: variable 'ansible_host' from source: host vars for 'managed_node2' 8909 1726773038.76972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8909 1726773038.77044: Set connection var ansible_pipelining to False 8909 1726773038.77053: Set connection var ansible_timeout to 10 8909 1726773038.77061: Set connection var ansible_module_compression to ZIP_DEFLATED 8909 1726773038.77064: Set connection var ansible_shell_type to sh 8909 1726773038.77072: Set connection var ansible_shell_executable to /bin/sh 8909 1726773038.77077: Set connection var ansible_connection to ssh 8909 1726773038.77095: variable 'ansible_shell_executable' from source: unknown 8909 1726773038.77099: variable 'ansible_connection' from source: unknown 8909 1726773038.77102: variable 'ansible_module_compression' from source: unknown 8909 1726773038.77106: variable 'ansible_shell_type' from source: unknown 8909 1726773038.77109: variable 'ansible_shell_executable' from source: unknown 8909 1726773038.77111: variable 'ansible_host' from source: host vars for 'managed_node2' 8909 1726773038.77113: variable 'ansible_pipelining' from source: unknown 8909 1726773038.77115: variable 'ansible_timeout' from source: unknown 8909 1726773038.77117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8909 1726773038.77206: Loading ActionModule 'copy' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8909 1726773038.77216: variable 'omit' from source: magic vars 8909 1726773038.77222: starting attempt loop 8909 1726773038.77224: running the handler 8909 1726773038.77234: _low_level_execute_command(): starting 8909 1726773038.77240: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8909 1726773038.79630: stdout chunk (state=2): >>>/root <<< 8909 1726773038.79755: stderr chunk (state=3): >>><<< 8909 1726773038.79764: stdout chunk (state=3): >>><<< 8909 1726773038.79790: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8909 1726773038.79804: _low_level_execute_command(): starting 8909 1726773038.79810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189 `" && echo ansible-tmp-1726773038.797986-8909-199124539978189="` echo /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189 `" ) && sleep 0' 8909 1726773038.82315: stdout chunk (state=2): >>>ansible-tmp-1726773038.797986-8909-199124539978189=/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189 <<< 8909 1726773038.82451: stderr chunk (state=3): >>><<< 8909 1726773038.82459: stdout chunk (state=3): >>><<< 8909 1726773038.82480: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773038.797986-8909-199124539978189=/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189 , stderr= 8909 1726773038.82555: variable 'ansible_module_compression' from source: unknown 8909 1726773038.82604: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8909 1726773038.82633: variable 'ansible_facts' from source: unknown 8909 1726773038.82703: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/AnsiballZ_stat.py 8909 1726773038.82799: Sending initial data 8909 1726773038.82806: Sent initial data (150 bytes) 8909 1726773038.85327: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp2jc8fh16 /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/AnsiballZ_stat.py <<< 8909 1726773038.86475: stderr chunk (state=3): >>><<< 8909 1726773038.86484: stdout chunk (state=3): >>><<< 8909 1726773038.86504: done transferring module to remote 8909 1726773038.86515: _low_level_execute_command(): starting 8909 1726773038.86521: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/ /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/AnsiballZ_stat.py && sleep 0' 8909 1726773038.88946: stderr chunk (state=2): >>><<< 8909 1726773038.88959: stdout chunk (state=2): >>><<< 8909 1726773038.88977: _low_level_execute_command() done: rc=0, stdout=, stderr= 8909 1726773038.88982: _low_level_execute_command(): starting 8909 1726773038.88989: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/AnsiballZ_stat.py && sleep 0' 8909 1726773039.05036: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": 
"/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726773038.666086, "mtime": 1726773032.395023, "ctime": 1726773032.395023, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8909 1726773039.06213: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8909 1726773039.06224: stdout chunk (state=3): >>><<< 8909 1726773039.06235: stderr chunk (state=3): >>><<< 8909 1726773039.06249: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726773038.666086, "mtime": 1726773032.395023, "ctime": 1726773032.395023, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
8909 1726773039.06293: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8909 1726773039.06381: Sending initial data 8909 1726773039.06390: Sent initial data (139 bytes) 8909 1726773039.09157: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmps3xhxv2r /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/source <<< 8909 1726773039.10549: stderr chunk (state=3): >>><<< 8909 1726773039.10560: stdout chunk (state=3): >>><<< 8909 1726773039.10583: _low_level_execute_command(): starting 8909 1726773039.10591: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/ /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/source && sleep 0' 8909 1726773039.13693: stderr chunk (state=2): >>><<< 8909 1726773039.13707: stdout chunk (state=2): >>><<< 8909 1726773039.13726: _low_level_execute_command() done: rc=0, stdout=, stderr= 8909 1726773039.13753: variable 'ansible_module_compression' from source: unknown 8909 1726773039.13805: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8909 1726773039.13830: variable 'ansible_facts' from source: unknown 8909 1726773039.13917: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/AnsiballZ_copy.py 8909 1726773039.14205: Sending initial data 8909 1726773039.14212: Sent initial data (150 bytes) 8909 1726773039.16981: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpotixugy4 /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/AnsiballZ_copy.py <<< 8909 1726773039.18716: stderr chunk (state=3): >>><<< 8909 1726773039.18729: stdout chunk (state=3): >>><<< 8909 1726773039.18755: done transferring module to remote 8909 1726773039.18770: _low_level_execute_command(): starting 8909 1726773039.18777: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/ /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/AnsiballZ_copy.py && sleep 0' 8909 1726773039.21441: stderr chunk (state=2): >>><<< 8909 1726773039.21452: stdout chunk (state=2): >>><<< 8909 1726773039.21471: _low_level_execute_command() done: rc=0, stdout=, stderr= 8909 1726773039.21476: _low_level_execute_command(): starting 8909 1726773039.21481: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/AnsiballZ_copy.py && sleep 0' 8909 1726773039.37829: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/source", "_original_basename": "tmps3xhxv2r", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8909 1726773039.39037: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8909 1726773039.39090: stderr chunk (state=3): >>><<< 8909 1726773039.39098: stdout chunk (state=3): >>><<< 8909 1726773039.39117: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/source", "_original_basename": "tmps3xhxv2r", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
8909 1726773039.39154: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/source', '_original_basename': 'tmps3xhxv2r', 'follow': False, 'checksum': 'a79569d3860cb6a066e0e92c8b22ffd0e8796bfd', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8909 1726773039.39169: _low_level_execute_command(): starting 8909 1726773039.39178: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/ > /dev/null 2>&1 && sleep 0' 8909 1726773039.42193: stderr chunk (state=2): >>><<< 8909 1726773039.42206: stdout chunk (state=2): >>><<< 8909 1726773039.42224: _low_level_execute_command() done: rc=0, stdout=, stderr= 8909 1726773039.42235: handler run complete 8909 1726773039.42264: attempt loop complete, returning result 8909 1726773039.42273: _execute() done 8909 1726773039.42277: dumping result to json 8909 1726773039.42283: done dumping result, returning 8909 1726773039.42293: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-885f-bbcf-0000000000bc] 8909 1726773039.42300: sending task result for task 0affffe7-6841-885f-bbcf-0000000000bc 8909 1726773039.42343: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000bc 8909 1726773039.42347: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "src": "/root/.ansible/tmp/ansible-tmp-1726773038.797986-8909-199124539978189/source", "state": "file", "uid": 0 } 8240 1726773039.42800: no more pending results, returning what we have 8240 1726773039.42804: results queue empty 8240 1726773039.42805: checking for any_errors_fatal 8240 1726773039.42811: done checking for any_errors_fatal 8240 1726773039.42812: checking for max_fail_percentage 8240 1726773039.42815: done checking for max_fail_percentage 8240 1726773039.42816: checking to see if all hosts have failed and the running result is not ok 8240 1726773039.42816: done checking to see if all hosts have failed 8240 1726773039.42817: getting the remaining hosts for this loop 8240 1726773039.42818: done getting the remaining hosts for this loop 8240 1726773039.42821: getting the next task for host managed_node2 8240 1726773039.42827: done getting next task for host managed_node2 8240 1726773039.42830: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8240 1726773039.42832: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773039.42842: getting variables 8240 1726773039.42843: in VariableManager get_vars() 8240 1726773039.42876: Calling all_inventory to load vars for managed_node2 8240 1726773039.42879: Calling groups_inventory to load vars for managed_node2 8240 1726773039.42881: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773039.42892: Calling all_plugins_play to load vars for managed_node2 8240 1726773039.42895: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773039.42898: Calling groups_plugins_play to load vars for managed_node2 8240 1726773039.43118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773039.43320: done with get_vars() 8240 1726773039.43331: done getting variables 8240 1726773039.43393: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:10:39 -0400 (0:00:00.677) 0:00:18.078 **** 8240 1726773039.43423: entering _queue_task() for managed_node2/copy 8240 1726773039.43635: worker is 1 (out of 1 available) 8240 1726773039.43648: exiting _queue_task() for managed_node2/copy 8240 1726773039.43661: done queuing things up, now waiting for results queue to drain 8240 1726773039.43663: waiting for pending results... 
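The changed result above for "Ensure kernel_settings is in active_profile" rewrites /etc/tuned/active_profile through the copy action plugin: first an ansible.legacy.stat of the destination (to obtain its sha1 checksum), then an sftp transfer of the new content as a temporary source file, then ansible.legacy.copy to move it into place with mode 0600. The reported size of 30 bytes matches the fact set earlier, "virtual-guest kernel_settings" (29 characters) plus a trailing newline, so an equivalent task is roughly the sketch below; the content expression is inferred from the result, not taken from the role source.

# Sketch only -- content and mode inferred from the copy result above.
- name: Ensure kernel_settings is in active_profile
  copy:
    content: "{{ __kernel_settings_active_profile }}\n"
    dest: /etc/tuned/active_profile
    mode: "0600"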
8972 1726773039.43891: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8972 1726773039.44030: in run() - task 0affffe7-6841-885f-bbcf-0000000000bd 8972 1726773039.44051: variable 'ansible_search_path' from source: unknown 8972 1726773039.44056: variable 'ansible_search_path' from source: unknown 8972 1726773039.44092: calling self._execute() 8972 1726773039.44175: variable 'ansible_host' from source: host vars for 'managed_node2' 8972 1726773039.44187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8972 1726773039.44196: variable 'omit' from source: magic vars 8972 1726773039.44298: variable 'omit' from source: magic vars 8972 1726773039.44344: variable 'omit' from source: magic vars 8972 1726773039.44376: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 8972 1726773039.44660: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 8972 1726773039.44747: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8972 1726773039.44840: variable 'omit' from source: magic vars 8972 1726773039.44883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8972 1726773039.44922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8972 1726773039.44944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8972 1726773039.44962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8972 1726773039.44977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8972 1726773039.45009: variable 'inventory_hostname' from source: host vars for 'managed_node2' 8972 1726773039.45015: variable 'ansible_host' from source: host vars for 'managed_node2' 8972 1726773039.45020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8972 1726773039.45123: Set connection var ansible_pipelining to False 8972 1726773039.45132: Set connection var ansible_timeout to 10 8972 1726773039.45142: Set connection var ansible_module_compression to ZIP_DEFLATED 8972 1726773039.45146: Set connection var ansible_shell_type to sh 8972 1726773039.45151: Set connection var ansible_shell_executable to /bin/sh 8972 1726773039.45156: Set connection var ansible_connection to ssh 8972 1726773039.45179: variable 'ansible_shell_executable' from source: unknown 8972 1726773039.45186: variable 'ansible_connection' from source: unknown 8972 1726773039.45190: variable 'ansible_module_compression' from source: unknown 8972 1726773039.45194: variable 'ansible_shell_type' from source: unknown 8972 1726773039.45196: variable 'ansible_shell_executable' from source: unknown 8972 1726773039.45199: variable 'ansible_host' from source: host vars for 'managed_node2' 8972 1726773039.45203: variable 'ansible_pipelining' from source: unknown 8972 1726773039.45206: variable 'ansible_timeout' from source: unknown 8972 1726773039.45210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 8972 1726773039.45337: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 8972 1726773039.45349: variable 'omit' from source: magic vars 8972 1726773039.45357: starting attempt loop 8972 1726773039.45360: running the handler 8972 1726773039.45375: _low_level_execute_command(): starting 8972 1726773039.45384: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8972 1726773039.48501: stdout chunk (state=2): >>>/root <<< 8972 1726773039.48658: stderr chunk (state=3): >>><<< 8972 1726773039.48670: stdout chunk (state=3): >>><<< 8972 1726773039.48697: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8972 1726773039.48713: _low_level_execute_command(): starting 8972 1726773039.48721: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462 `" && echo ansible-tmp-1726773039.4870634-8972-173403374190462="` echo /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462 `" ) && sleep 0' 8972 1726773039.52302: stdout chunk (state=2): >>>ansible-tmp-1726773039.4870634-8972-173403374190462=/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462 <<< 8972 1726773039.52978: stderr chunk (state=3): >>><<< 8972 1726773039.52990: stdout chunk (state=3): >>><<< 8972 1726773039.53010: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773039.4870634-8972-173403374190462=/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462 , stderr= 8972 1726773039.53101: variable 'ansible_module_compression' from source: unknown 8972 1726773039.53161: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8972 1726773039.53201: variable 'ansible_facts' from source: unknown 8972 1726773039.53302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/AnsiballZ_stat.py 8972 1726773039.54510: Sending initial data 8972 1726773039.54521: Sent initial data (151 bytes) 8972 1726773039.63699: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp38fe9obn /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/AnsiballZ_stat.py <<< 8972 1726773039.71593: stderr chunk (state=3): >>><<< 8972 1726773039.71605: stdout chunk (state=3): >>><<< 8972 1726773039.71631: done transferring module to remote 8972 1726773039.71646: _low_level_execute_command(): starting 8972 1726773039.71652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/ /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/AnsiballZ_stat.py && sleep 0' 8972 1726773039.76681: stderr chunk (state=2): >>><<< 8972 1726773039.76695: stdout chunk (state=2): >>><<< 8972 1726773039.76714: _low_level_execute_command() done: rc=0, stdout=, stderr= 8972 1726773039.76720: _low_level_execute_command(): starting 8972 1726773039.76727: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/AnsiballZ_stat.py && sleep 0' 8972 1726773039.95194: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 
1, "atime": 1726773032.2670217, "mtime": 1726773032.395023, "ctime": 1726773032.395023, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8972 1726773039.96346: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8972 1726773039.96358: stdout chunk (state=3): >>><<< 8972 1726773039.96369: stderr chunk (state=3): >>><<< 8972 1726773039.96386: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726773032.2670217, "mtime": 1726773032.395023, "ctime": 1726773032.395023, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
8972 1726773039.96448: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8972 1726773039.96984: Sending initial data 8972 1726773039.96993: Sent initial data (140 bytes) 8972 1726773040.01493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpw6xsi7f_ /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/source <<< 8972 1726773040.02593: stderr chunk (state=3): >>><<< 8972 1726773040.02604: stdout chunk (state=3): >>><<< 8972 1726773040.02628: _low_level_execute_command(): starting 8972 1726773040.02635: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/ /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/source && sleep 0' 8972 1726773040.06693: stderr chunk (state=2): >>><<< 8972 1726773040.06708: stdout chunk (state=2): >>><<< 8972 1726773040.06726: _low_level_execute_command() done: rc=0, stdout=, stderr= 8972 1726773040.06756: variable 'ansible_module_compression' from source: unknown 8972 1726773040.06805: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8972 1726773040.06826: variable 'ansible_facts' from source: unknown 8972 1726773040.06912: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/AnsiballZ_copy.py 8972 1726773040.07519: Sending initial data 8972 1726773040.07527: Sent initial data (151 bytes) 8972 1726773040.10100: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpo4_xkvp4 /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/AnsiballZ_copy.py <<< 8972 1726773040.11756: stderr chunk (state=3): >>><<< 8972 1726773040.11766: stdout chunk (state=3): >>><<< 8972 1726773040.11786: done transferring module to remote 8972 1726773040.11794: _low_level_execute_command(): starting 8972 1726773040.11798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/ /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/AnsiballZ_copy.py && sleep 0' 8972 1726773040.14228: stderr chunk (state=2): >>><<< 8972 1726773040.14240: stdout chunk (state=2): >>><<< 8972 1726773040.14260: _low_level_execute_command() done: rc=0, stdout=, stderr= 8972 1726773040.14265: _low_level_execute_command(): starting 8972 1726773040.14271: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/AnsiballZ_copy.py && sleep 0' 8972 1726773040.31917: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/source", "_original_basename": "tmpw6xsi7f_", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8972 1726773040.33061: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 8972 1726773040.33075: stdout chunk (state=3): >>><<< 8972 1726773040.33090: stderr chunk (state=3): >>><<< 8972 1726773040.33107: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/source", "_original_basename": "tmpw6xsi7f_", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
8972 1726773040.33146: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/source', '_original_basename': 'tmpw6xsi7f_', 'follow': False, 'checksum': '3ef9f23deed2e23d3ef2b88b842fb882313e15ce', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8972 1726773040.33159: _low_level_execute_command(): starting 8972 1726773040.33168: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/ > /dev/null 2>&1 && sleep 0' 8972 1726773040.39301: stderr chunk (state=2): >>><<< 8972 1726773040.39314: stdout chunk (state=2): >>><<< 8972 1726773040.39334: _low_level_execute_command() done: rc=0, stdout=, stderr= 8972 1726773040.39344: handler run complete 8972 1726773040.39377: attempt loop complete, returning result 8972 1726773040.39383: _execute() done 8972 1726773040.39389: dumping result to json 8972 1726773040.39395: done dumping result, returning 8972 1726773040.39404: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-885f-bbcf-0000000000bd] 8972 1726773040.39411: sending task result for task 0affffe7-6841-885f-bbcf-0000000000bd 8972 1726773040.39454: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000bd 8972 1726773040.39459: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "src": "/root/.ansible/tmp/ansible-tmp-1726773039.4870634-8972-173403374190462/source", "state": "file", "uid": 0 } 8240 1726773040.40376: no more pending results, returning what we have 8240 1726773040.40379: results queue empty 8240 1726773040.40380: checking for any_errors_fatal 8240 1726773040.40389: done checking for any_errors_fatal 8240 1726773040.40390: checking for max_fail_percentage 8240 1726773040.40391: done checking for max_fail_percentage 8240 1726773040.40392: checking to see if all hosts have failed and the running result is not ok 8240 1726773040.40393: done checking to see if all hosts have failed 8240 1726773040.40393: getting the remaining hosts for this loop 8240 1726773040.40395: done getting the remaining hosts for this loop 8240 1726773040.40410: getting the next task for host managed_node2 8240 1726773040.40416: done getting next task for host managed_node2 8240 1726773040.40419: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8240 1726773040.40422: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773040.40431: getting variables 8240 1726773040.40433: in VariableManager get_vars() 8240 1726773040.40465: Calling all_inventory to load vars for managed_node2 8240 1726773040.40468: Calling groups_inventory to load vars for managed_node2 8240 1726773040.40470: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773040.40479: Calling all_plugins_play to load vars for managed_node2 8240 1726773040.40482: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773040.40487: Calling groups_plugins_play to load vars for managed_node2 8240 1726773040.40647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773040.40841: done with get_vars() 8240 1726773040.40852: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:10:40 -0400 (0:00:00.975) 0:00:19.053 **** 8240 1726773040.40936: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773040.41134: worker is 1 (out of 1 available) 8240 1726773040.41147: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773040.41161: done queuing things up, now waiting for results queue to drain 8240 1726773040.41162: waiting for pending results... 
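The task just queued calls the collection's own module fedora.linux_system_roles.kernel_settings_get_config, which, per the module_args printed further down, reads the current tuned profile file. A minimal sketch of such a task, using only the module name and path that appear in this log (the register variable name is hypothetical, added for illustration):

    - name: Get current config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: /etc/tuned/kernel_settings/tuned.conf
      register: kernel_settings_current_config   # hypothetical name; the real role may store this differently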
9029 1726773040.42410: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 9029 1726773040.42553: in run() - task 0affffe7-6841-885f-bbcf-0000000000be 9029 1726773040.42575: variable 'ansible_search_path' from source: unknown 9029 1726773040.42581: variable 'ansible_search_path' from source: unknown 9029 1726773040.42618: calling self._execute() 9029 1726773040.42699: variable 'ansible_host' from source: host vars for 'managed_node2' 9029 1726773040.42709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9029 1726773040.42718: variable 'omit' from source: magic vars 9029 1726773040.42823: variable 'omit' from source: magic vars 9029 1726773040.42872: variable 'omit' from source: magic vars 9029 1726773040.42903: variable '__kernel_settings_profile_filename' from source: role '' all vars 9029 1726773040.43176: variable '__kernel_settings_profile_filename' from source: role '' all vars 9029 1726773040.43258: variable '__kernel_settings_profile_dir' from source: role '' all vars 9029 1726773040.44464: variable '__kernel_settings_profile_parent' from source: set_fact 9029 1726773040.44474: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9029 1726773040.44594: variable 'omit' from source: magic vars 9029 1726773040.44638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9029 1726773040.44672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9029 1726773040.44695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9029 1726773040.44718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9029 1726773040.44731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9029 1726773040.44768: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9029 1726773040.44779: variable 'ansible_host' from source: host vars for 'managed_node2' 9029 1726773040.44784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9029 1726773040.44883: Set connection var ansible_pipelining to False 9029 1726773040.44894: Set connection var ansible_timeout to 10 9029 1726773040.44903: Set connection var ansible_module_compression to ZIP_DEFLATED 9029 1726773040.44907: Set connection var ansible_shell_type to sh 9029 1726773040.44912: Set connection var ansible_shell_executable to /bin/sh 9029 1726773040.44917: Set connection var ansible_connection to ssh 9029 1726773040.44938: variable 'ansible_shell_executable' from source: unknown 9029 1726773040.44943: variable 'ansible_connection' from source: unknown 9029 1726773040.44946: variable 'ansible_module_compression' from source: unknown 9029 1726773040.44950: variable 'ansible_shell_type' from source: unknown 9029 1726773040.44953: variable 'ansible_shell_executable' from source: unknown 9029 1726773040.44956: variable 'ansible_host' from source: host vars for 'managed_node2' 9029 1726773040.44959: variable 'ansible_pipelining' from source: unknown 9029 1726773040.44963: variable 'ansible_timeout' from source: unknown 9029 1726773040.44966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9029 1726773040.45143: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9029 1726773040.45154: variable 'omit' from source: magic vars 9029 1726773040.45161: starting attempt loop 9029 1726773040.45164: running the handler 9029 1726773040.45178: _low_level_execute_command(): starting 9029 1726773040.45189: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9029 1726773040.48492: stdout chunk (state=2): >>>/root <<< 9029 1726773040.48503: stderr chunk (state=2): >>><<< 9029 1726773040.48516: stdout chunk (state=3): >>><<< 9029 1726773040.48531: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9029 1726773040.48545: _low_level_execute_command(): starting 9029 1726773040.48552: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955 `" && echo ansible-tmp-1726773040.4853976-9029-217480545845955="` echo /root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955 `" ) && sleep 0' 9029 1726773040.52691: stdout chunk (state=2): >>>ansible-tmp-1726773040.4853976-9029-217480545845955=/root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955 <<< 9029 1726773040.52701: stderr chunk (state=2): >>><<< 9029 1726773040.52711: stdout chunk (state=3): >>><<< 9029 1726773040.52723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773040.4853976-9029-217480545845955=/root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955 , stderr= 9029 1726773040.52762: variable 'ansible_module_compression' from source: unknown 9029 1726773040.52799: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 9029 1726773040.52830: variable 'ansible_facts' from source: unknown 9029 1726773040.52900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955/AnsiballZ_kernel_settings_get_config.py 9029 1726773040.53499: Sending initial data 9029 1726773040.53506: Sent initial data (173 bytes) 9029 1726773040.56913: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpljo2ar0l /root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955/AnsiballZ_kernel_settings_get_config.py <<< 9029 1726773040.58274: stderr chunk (state=3): >>><<< 9029 1726773040.58288: stdout chunk (state=3): >>><<< 9029 1726773040.58313: done transferring module to remote 9029 1726773040.58326: _low_level_execute_command(): starting 9029 1726773040.58332: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955/ /root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9029 1726773040.61517: stderr chunk (state=2): >>><<< 9029 1726773040.61533: stdout chunk (state=2): >>><<< 9029 1726773040.61554: _low_level_execute_command() done: rc=0, stdout=, stderr= 9029 1726773040.61560: _low_level_execute_command(): starting 9029 1726773040.61565: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955/AnsiballZ_kernel_settings_get_config.py && sleep 0' 9029 1726773040.77879: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724", "kernel.threads-max": "29968", "vm.max_map_count": "65530"}, "sysfs": {"/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0", "/sys/kernel/debug/x86/ibrs_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 9029 1726773040.78949: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9029 1726773040.78997: stderr chunk (state=3): >>><<< 9029 1726773040.79005: stdout chunk (state=3): >>><<< 9029 1726773040.79023: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "379724", "kernel.threads-max": "29968", "vm.max_map_count": "65530"}, "sysfs": {"/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0", "/sys/kernel/debug/x86/ibrs_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 9029 1726773040.79050: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9029 1726773040.79061: _low_level_execute_command(): starting 9029 1726773040.79070: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773040.4853976-9029-217480545845955/ > /dev/null 2>&1 && sleep 0' 9029 1726773040.81490: stderr chunk (state=2): >>><<< 9029 1726773040.81499: stdout chunk (state=2): >>><<< 9029 1726773040.81516: _low_level_execute_command() done: rc=0, stdout=, stderr= 9029 1726773040.81522: handler run complete 9029 1726773040.81539: attempt loop complete, returning result 9029 1726773040.81542: _execute() done 9029 1726773040.81544: dumping result to json 9029 1726773040.81547: done dumping result, returning 9029 1726773040.81553: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-885f-bbcf-0000000000be] 9029 1726773040.81558: sending task result for task 0affffe7-6841-885f-bbcf-0000000000be 9029 1726773040.81588: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000be 9029 1726773040.81592: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "379724", "kernel.threads-max": "29968", "vm.max_map_count": 
"65530" }, "sysfs": { "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8240 1726773040.81752: no more pending results, returning what we have 8240 1726773040.81756: results queue empty 8240 1726773040.81756: checking for any_errors_fatal 8240 1726773040.81762: done checking for any_errors_fatal 8240 1726773040.81763: checking for max_fail_percentage 8240 1726773040.81764: done checking for max_fail_percentage 8240 1726773040.81765: checking to see if all hosts have failed and the running result is not ok 8240 1726773040.81765: done checking to see if all hosts have failed 8240 1726773040.81766: getting the remaining hosts for this loop 8240 1726773040.81767: done getting the remaining hosts for this loop 8240 1726773040.81770: getting the next task for host managed_node2 8240 1726773040.81775: done getting next task for host managed_node2 8240 1726773040.81778: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8240 1726773040.81780: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773040.81791: getting variables 8240 1726773040.81792: in VariableManager get_vars() 8240 1726773040.81821: Calling all_inventory to load vars for managed_node2 8240 1726773040.81824: Calling groups_inventory to load vars for managed_node2 8240 1726773040.81826: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773040.81834: Calling all_plugins_play to load vars for managed_node2 8240 1726773040.81837: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773040.81839: Calling groups_plugins_play to load vars for managed_node2 8240 1726773040.81979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773040.82097: done with get_vars() 8240 1726773040.82106: done getting variables 8240 1726773040.82187: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:10:40 -0400 (0:00:00.412) 0:00:19.465 **** 8240 1726773040.82209: entering _queue_task() for managed_node2/template 8240 1726773040.82210: Creating lock for template 8240 1726773040.82368: worker is 1 (out of 1 available) 8240 1726773040.82381: exiting _queue_task() for managed_node2/template 8240 1726773040.82396: done queuing things up, now waiting for results queue to drain 8240 1726773040.82397: waiting for pending results... 
9069 1726773040.82512: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 9069 1726773040.82621: in run() - task 0affffe7-6841-885f-bbcf-0000000000bf 9069 1726773040.82636: variable 'ansible_search_path' from source: unknown 9069 1726773040.82640: variable 'ansible_search_path' from source: unknown 9069 1726773040.82668: calling self._execute() 9069 1726773040.82735: variable 'ansible_host' from source: host vars for 'managed_node2' 9069 1726773040.82745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9069 1726773040.82754: variable 'omit' from source: magic vars 9069 1726773040.82826: variable 'omit' from source: magic vars 9069 1726773040.82871: variable 'omit' from source: magic vars 9069 1726773040.83104: variable '__kernel_settings_profile_src' from source: role '' all vars 9069 1726773040.83113: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9069 1726773040.83187: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9069 1726773040.83209: variable '__kernel_settings_profile_filename' from source: role '' all vars 9069 1726773040.83268: variable '__kernel_settings_profile_filename' from source: role '' all vars 9069 1726773040.83330: variable '__kernel_settings_profile_dir' from source: role '' all vars 9069 1726773040.83418: variable '__kernel_settings_profile_parent' from source: set_fact 9069 1726773040.83426: variable '__kernel_settings_tuned_profile' from source: role '' all vars 9069 1726773040.83453: variable 'omit' from source: magic vars 9069 1726773040.83492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9069 1726773040.83521: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9069 1726773040.83539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9069 1726773040.83555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9069 1726773040.83568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9069 1726773040.83594: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9069 1726773040.83599: variable 'ansible_host' from source: host vars for 'managed_node2' 9069 1726773040.83602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9069 1726773040.83690: Set connection var ansible_pipelining to False 9069 1726773040.83697: Set connection var ansible_timeout to 10 9069 1726773040.83704: Set connection var ansible_module_compression to ZIP_DEFLATED 9069 1726773040.83707: Set connection var ansible_shell_type to sh 9069 1726773040.83711: Set connection var ansible_shell_executable to /bin/sh 9069 1726773040.83716: Set connection var ansible_connection to ssh 9069 1726773040.83732: variable 'ansible_shell_executable' from source: unknown 9069 1726773040.83735: variable 'ansible_connection' from source: unknown 9069 1726773040.83738: variable 'ansible_module_compression' from source: unknown 9069 1726773040.83740: variable 'ansible_shell_type' from source: unknown 9069 1726773040.83743: variable 'ansible_shell_executable' from source: unknown 9069 1726773040.83745: variable 'ansible_host' from source: host vars for 'managed_node2' 9069 1726773040.83748: variable 'ansible_pipelining' from 
source: unknown 9069 1726773040.83750: variable 'ansible_timeout' from source: unknown 9069 1726773040.83753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9069 1726773040.83859: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9069 1726773040.83872: variable 'omit' from source: magic vars 9069 1726773040.83878: starting attempt loop 9069 1726773040.83882: running the handler 9069 1726773040.83894: _low_level_execute_command(): starting 9069 1726773040.83900: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9069 1726773040.86222: stdout chunk (state=2): >>>/root <<< 9069 1726773040.86346: stderr chunk (state=3): >>><<< 9069 1726773040.86355: stdout chunk (state=3): >>><<< 9069 1726773040.86376: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9069 1726773040.86391: _low_level_execute_command(): starting 9069 1726773040.86398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629 `" && echo ansible-tmp-1726773040.8638358-9069-274450713132629="` echo /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629 `" ) && sleep 0' 9069 1726773040.88869: stdout chunk (state=2): >>>ansible-tmp-1726773040.8638358-9069-274450713132629=/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629 <<< 9069 1726773040.88999: stderr chunk (state=3): >>><<< 9069 1726773040.89006: stdout chunk (state=3): >>><<< 9069 1726773040.89021: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773040.8638358-9069-274450713132629=/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629 , stderr= 9069 1726773040.89039: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 9069 1726773040.89055: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 9069 1726773040.89074: variable 'ansible_search_path' from source: unknown 9069 1726773040.89683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9069 1726773040.91124: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9069 1726773040.91361: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9069 1726773040.91396: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9069 1726773040.91425: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9069 1726773040.91446: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9069 1726773040.91639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9069 1726773040.91660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9069 1726773040.91683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9069 1726773040.91713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9069 1726773040.91724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9069 1726773040.91953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9069 1726773040.91975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9069 1726773040.91995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9069 1726773040.92020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9069 1726773040.92031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9069 1726773040.92278: variable 'ansible_managed' from source: unknown 9069 1726773040.92288: variable '__sections' from source: task vars 9069 1726773040.92374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9069 1726773040.92393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9069 1726773040.92410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9069 1726773040.92432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9069 1726773040.92440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9069 1726773040.92510: variable 'kernel_settings_sysctl' from source: include params 9069 1726773040.92518: variable '__kernel_settings_state_empty' from source: role '' all vars 9069 1726773040.92522: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9069 1726773040.92558: variable '__sysctl_old' from source: task vars 9069 1726773040.92609: variable '__sysctl_old' from source: task vars 9069 1726773040.92748: variable 'kernel_settings_purge' from source: role '' defaults 9069 1726773040.92754: variable 'kernel_settings_sysctl' from source: include params 9069 1726773040.92759: variable '__kernel_settings_state_empty' from source: role '' all vars 9069 1726773040.92762: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9069 1726773040.92764: variable '__kernel_settings_profile_contents' from source: set_fact 9069 1726773040.92918: variable 'kernel_settings_sysfs' from source: include params 9069 1726773040.92927: variable '__kernel_settings_state_empty' from source: role '' all vars 9069 1726773040.92932: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9069 1726773040.92951: variable '__sysfs_old' from source: task vars 9069 1726773040.92996: variable '__sysfs_old' from source: task vars 9069 1726773040.93135: variable 'kernel_settings_purge' from source: role '' defaults 9069 1726773040.93142: variable 'kernel_settings_sysfs' from source: include params 9069 1726773040.93149: variable '__kernel_settings_state_empty' from source: role '' all vars 9069 1726773040.93154: variable '__kernel_settings_previous_replaced' from source: role '' all vars 9069 1726773040.93158: variable '__kernel_settings_profile_contents' from source: set_fact 9069 1726773040.93192: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 9069 1726773040.93201: variable '__systemd_old' from source: task vars 9069 1726773040.93241: variable '__systemd_old' from source: task vars 9069 1726773040.93375: variable 'kernel_settings_purge' from source: role '' defaults 9069 1726773040.93382: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 9069 1726773040.93388: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93394: variable '__kernel_settings_profile_contents' from source: set_fact 9069 1726773040.93406: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 9069 1726773040.93410: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 9069 1726773040.93415: variable '__trans_huge_old' from source: task vars 9069 1726773040.93454: variable '__trans_huge_old' from source: task vars 9069 1726773040.93591: variable 'kernel_settings_purge' from source: role '' defaults 9069 1726773040.93598: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 9069 1726773040.93603: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93608: variable '__kernel_settings_profile_contents' from source: set_fact 9069 1726773040.93619: variable '__trans_defrag_old' from source: task vars 9069 1726773040.93658: variable 
'__trans_defrag_old' from source: task vars 9069 1726773040.93791: variable 'kernel_settings_purge' from source: role '' defaults 9069 1726773040.93797: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 9069 1726773040.93802: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93807: variable '__kernel_settings_profile_contents' from source: set_fact 9069 1726773040.93822: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93831: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93839: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93847: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93852: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93855: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93867: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93873: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93878: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93881: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93891: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93899: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93907: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93913: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.93921: variable '__kernel_settings_state_absent' from source: role '' all vars 9069 1726773040.94392: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9069 1726773040.94436: variable 'ansible_module_compression' from source: unknown 9069 1726773040.94477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9069 1726773040.94501: variable 'ansible_facts' from source: unknown 9069 1726773040.94568: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/AnsiballZ_stat.py 9069 1726773040.94662: Sending initial data 9069 1726773040.94672: Sent initial data (151 bytes) 9069 1726773040.97279: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp6t_mlok7 /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/AnsiballZ_stat.py <<< 9069 1726773040.98513: stderr chunk (state=3): >>><<< 9069 1726773040.98526: stdout chunk (state=3): >>><<< 9069 1726773040.98551: done transferring module to remote 9069 1726773040.98565: _low_level_execute_command(): starting 9069 1726773040.98572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/ /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/AnsiballZ_stat.py && sleep 0' 9069 1726773041.01092: stderr chunk (state=2): >>><<< 9069 1726773041.01103: stdout chunk (state=2): >>><<< 9069 1726773041.01118: 
_low_level_execute_command() done: rc=0, stdout=, stderr= 9069 1726773041.01123: _low_level_execute_command(): starting 9069 1726773041.01128: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/AnsiballZ_stat.py && sleep 0' 9069 1726773041.17241: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 381, "inode": 473956483, "dev": 51713, "nlink": 1, "atime": 1726773040.777107, "mtime": 1726773030.8490076, "ctime": 1726773031.2780118, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "mimetype": "text/plain", "charset": "us-ascii", "version": "1798742238", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 9069 1726773041.18439: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9069 1726773041.18452: stdout chunk (state=3): >>><<< 9069 1726773041.18467: stderr chunk (state=3): >>><<< 9069 1726773041.18481: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 381, "inode": 473956483, "dev": 51713, "nlink": 1, "atime": 1726773040.777107, "mtime": 1726773030.8490076, "ctime": 1726773031.2780118, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "13fdc203370e2b8e7e42c13d94b671b1ac621563", "mimetype": "text/plain", "charset": "us-ascii", "version": "1798742238", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
9069 1726773041.18527: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9069 1726773041.18647: Sending initial data 9069 1726773041.18658: Sent initial data (159 bytes) 9069 1726773041.21227: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpd4ejbukl/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/source <<< 9069 1726773041.22092: stderr chunk (state=3): >>><<< 9069 1726773041.22101: stdout chunk (state=3): >>><<< 9069 1726773041.22119: _low_level_execute_command(): starting 9069 1726773041.22126: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/ /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/source && sleep 0' 9069 1726773041.24873: stderr chunk (state=2): >>><<< 9069 1726773041.24886: stdout chunk (state=2): >>><<< 9069 1726773041.24902: _low_level_execute_command() done: rc=0, stdout=, stderr= 9069 1726773041.24928: variable 'ansible_module_compression' from source: unknown 9069 1726773041.24972: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 9069 1726773041.24993: variable 'ansible_facts' from source: unknown 9069 1726773041.25054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/AnsiballZ_copy.py 9069 1726773041.25151: Sending initial data 9069 1726773041.25158: Sent initial data (151 bytes) 9069 1726773041.28007: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpt8lr3h5g /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/AnsiballZ_copy.py <<< 9069 1726773041.30215: stderr chunk (state=3): >>><<< 9069 1726773041.30230: stdout chunk (state=3): >>><<< 9069 1726773041.30254: done transferring module to remote 9069 1726773041.30264: _low_level_execute_command(): starting 9069 1726773041.30271: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/ /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/AnsiballZ_copy.py && sleep 0' 9069 1726773041.33743: stderr chunk (state=2): >>><<< 9069 1726773041.33754: stdout chunk (state=2): >>><<< 9069 1726773041.33773: _low_level_execute_command() done: rc=0, stdout=, stderr= 9069 1726773041.33779: _low_level_execute_command(): starting 9069 1726773041.33787: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/AnsiballZ_copy.py && sleep 0' 9069 1726773041.50954: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/source", "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 9069 1726773041.52159: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9069 1726773041.52173: stdout chunk (state=3): >>><<< 9069 1726773041.52188: stderr chunk (state=3): >>><<< 9069 1726773041.52204: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/source", "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
9069 1726773041.52241: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '3feaf86b2638623e3300792e683ce55f91f31e9a', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9069 1726773041.52272: _low_level_execute_command(): starting 9069 1726773041.52281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/ > /dev/null 2>&1 && sleep 0' 9069 1726773041.54967: stderr chunk (state=2): >>><<< 9069 1726773041.54978: stdout chunk (state=2): >>><<< 9069 1726773041.55000: _low_level_execute_command() done: rc=0, stdout=, stderr= 9069 1726773041.55012: handler run complete 9069 1726773041.55040: attempt loop complete, returning result 9069 1726773041.55048: _execute() done 9069 1726773041.55051: dumping result to json 9069 1726773041.55057: done dumping result, returning 9069 1726773041.55065: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-885f-bbcf-0000000000bf] 9069 1726773041.55072: sending task result for task 0affffe7-6841-885f-bbcf-0000000000bf 9069 1726773041.55140: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000bf 9069 1726773041.55144: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "ba6aeb244f15eac2bcf15c0dd41fdff5", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "src": "/root/.ansible/tmp/ansible-tmp-1726773040.8638358-9069-274450713132629/source", "state": "file", "uid": 0 } 8240 1726773041.55612: no more pending results, returning what we have 8240 1726773041.55615: results queue empty 8240 1726773041.55616: checking for any_errors_fatal 8240 1726773041.55623: done checking for any_errors_fatal 8240 1726773041.55623: checking for max_fail_percentage 8240 1726773041.55625: done checking for max_fail_percentage 8240 1726773041.55625: checking to see if all hosts have failed and the running result is not ok 8240 1726773041.55626: done checking to see if all hosts have failed 8240 1726773041.55627: getting the remaining hosts for this loop 8240 1726773041.55628: done getting the remaining hosts for this loop 8240 1726773041.55633: getting the next task for host managed_node2 8240 1726773041.55638: done getting next task for host managed_node2 8240 1726773041.55641: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8240 1726773041.55643: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773041.55653: getting variables 8240 1726773041.55654: in VariableManager get_vars() 8240 1726773041.55687: Calling all_inventory to load vars for managed_node2 8240 1726773041.55690: Calling groups_inventory to load vars for managed_node2 8240 1726773041.55692: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773041.55702: Calling all_plugins_play to load vars for managed_node2 8240 1726773041.55705: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773041.55708: Calling groups_plugins_play to load vars for managed_node2 8240 1726773041.55868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773041.56066: done with get_vars() 8240 1726773041.56077: done getting variables 8240 1726773041.56136: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:10:41 -0400 (0:00:00.739) 0:00:20.205 **** 8240 1726773041.56167: entering _queue_task() for managed_node2/service 8240 1726773041.56367: worker is 1 (out of 1 available) 8240 1726773041.56381: exiting _queue_task() for managed_node2/service 8240 1726773041.56394: done queuing things up, now waiting for results queue to drain 8240 1726773041.56396: waiting for pending results... 
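The restart task queued here goes through the service action (dispatched to the systemd module on the target) and is guarded by the conditional the worker evaluates below. Sketching it from the fragments in this log, where the when expression, the item loop variable, and __kernel_settings_services are taken verbatim from the log and the rest is an assumption:

    - name: Restart tuned to apply active profile, mode changes
      ansible.builtin.service:
        name: "{{ item }}"
        state: restarted            # assumed from the task name; the log only shows a changed systemd result
      loop: "{{ __kernel_settings_services }}"
      when: __kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed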
9115 1726773041.57118: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 9115 1726773041.57259: in run() - task 0affffe7-6841-885f-bbcf-0000000000c0 9115 1726773041.57280: variable 'ansible_search_path' from source: unknown 9115 1726773041.57284: variable 'ansible_search_path' from source: unknown 9115 1726773041.57324: variable '__kernel_settings_services' from source: include_vars 9115 1726773041.57709: variable '__kernel_settings_services' from source: include_vars 9115 1726773041.57779: variable 'omit' from source: magic vars 9115 1726773041.57875: variable 'ansible_host' from source: host vars for 'managed_node2' 9115 1726773041.57888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9115 1726773041.57902: variable 'omit' from source: magic vars 9115 1726773041.58183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9115 1726773041.58382: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9115 1726773041.58431: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9115 1726773041.58475: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9115 1726773041.58512: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9115 1726773041.58616: variable '__kernel_settings_register_profile' from source: set_fact 9115 1726773041.58627: variable '__kernel_settings_register_mode' from source: set_fact 9115 1726773041.58646: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): True 9115 1726773041.58653: variable 'omit' from source: magic vars 9115 1726773041.58699: variable 'omit' from source: magic vars 9115 1726773041.58745: variable 'item' from source: unknown 9115 1726773041.58815: variable 'item' from source: unknown 9115 1726773041.58837: variable 'omit' from source: magic vars 9115 1726773041.58863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9115 1726773041.58897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9115 1726773041.58916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9115 1726773041.58933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9115 1726773041.58943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9115 1726773041.58974: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9115 1726773041.58980: variable 'ansible_host' from source: host vars for 'managed_node2' 9115 1726773041.58987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9115 1726773041.59082: Set connection var ansible_pipelining to False 9115 1726773041.59092: Set connection var ansible_timeout to 10 9115 1726773041.59101: Set connection var ansible_module_compression to ZIP_DEFLATED 9115 1726773041.59104: Set connection var ansible_shell_type to sh 9115 1726773041.59109: Set connection var ansible_shell_executable to /bin/sh 9115 1726773041.59115: Set connection var ansible_connection to ssh 9115 
1726773041.59133: variable 'ansible_shell_executable' from source: unknown 9115 1726773041.59139: variable 'ansible_connection' from source: unknown 9115 1726773041.59142: variable 'ansible_module_compression' from source: unknown 9115 1726773041.59145: variable 'ansible_shell_type' from source: unknown 9115 1726773041.59148: variable 'ansible_shell_executable' from source: unknown 9115 1726773041.59151: variable 'ansible_host' from source: host vars for 'managed_node2' 9115 1726773041.59155: variable 'ansible_pipelining' from source: unknown 9115 1726773041.59158: variable 'ansible_timeout' from source: unknown 9115 1726773041.59162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9115 1726773041.59259: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9115 1726773041.59272: variable 'omit' from source: magic vars 9115 1726773041.59279: starting attempt loop 9115 1726773041.59283: running the handler 9115 1726773041.59362: variable 'ansible_facts' from source: unknown 9115 1726773041.59560: _low_level_execute_command(): starting 9115 1726773041.59570: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9115 1726773041.62656: stdout chunk (state=2): >>>/root <<< 9115 1726773041.62674: stderr chunk (state=2): >>><<< 9115 1726773041.62690: stdout chunk (state=3): >>><<< 9115 1726773041.62708: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9115 1726773041.62723: _low_level_execute_command(): starting 9115 1726773041.62730: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242 `" && echo ansible-tmp-1726773041.6271691-9115-190479969626242="` echo /root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242 `" ) && sleep 0' 9115 1726773041.65955: stdout chunk (state=2): >>>ansible-tmp-1726773041.6271691-9115-190479969626242=/root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242 <<< 9115 1726773041.66011: stderr chunk (state=3): >>><<< 9115 1726773041.66023: stdout chunk (state=3): >>><<< 9115 1726773041.66041: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773041.6271691-9115-190479969626242=/root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242 , stderr= 9115 1726773041.66067: variable 'ansible_module_compression' from source: unknown 9115 1726773041.66135: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 9115 1726773041.66203: variable 'ansible_facts' from source: unknown 9115 1726773041.66445: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242/AnsiballZ_systemd.py 9115 1726773041.67023: Sending initial data 9115 1726773041.67030: Sent initial data (154 bytes) 9115 1726773041.69628: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp7nm_0utz /root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242/AnsiballZ_systemd.py <<< 9115 1726773041.72261: stderr chunk (state=3): >>><<< 9115 1726773041.72273: stdout chunk (state=3): >>><<< 9115 1726773041.72303: done transferring module to remote 9115 
1726773041.72317: _low_level_execute_command(): starting 9115 1726773041.72323: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242/ /root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242/AnsiballZ_systemd.py && sleep 0' 9115 1726773041.74993: stderr chunk (state=2): >>><<< 9115 1726773041.75005: stdout chunk (state=2): >>><<< 9115 1726773041.75022: _low_level_execute_command() done: rc=0, stdout=, stderr= 9115 1726773041.75027: _low_level_execute_command(): starting 9115 1726773041.75032: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242/AnsiballZ_systemd.py && sleep 0' 9115 1726773042.26893: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:32 EDT", "WatchdogTimestampMonotonic": "428428211", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8454", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ExecMainStartTimestampMonotonic": "428183847", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8454", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:32 EDT] ; stop_time=[n/a] ; pid=8454 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15036416", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", 
"LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.service dbus.socket system.slice sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "Before": "multi-user.target shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:32 EDT", "StateChangeTimestampMonotonic": "428428215", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveExitTimestampMonotonic": 
"428183898", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveEnterTimestampMonotonic": "428428215", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveExitTimestampMonotonic": "428067727", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveEnterTimestampMonotonic": "428180887", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ConditionTimestampMonotonic": "428181985", "AssertTimestamp": "Thu 2024-09-19 15:10:32 EDT", "AssertTimestampMonotonic": "428181987", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "e777597018b64b11af33cf8cce2131bb", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 9115 1726773042.28655: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9115 1726773042.28668: stdout chunk (state=3): >>><<< 9115 1726773042.28680: stderr chunk (state=3): >>><<< 9115 1726773042.28702: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:32 EDT", "WatchdogTimestampMonotonic": "428428211", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "8454", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ExecMainStartTimestampMonotonic": "428183847", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8454", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:32 EDT] ; stop_time=[n/a] ; pid=8454 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15036416", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": 
"[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "dbus.service 
dbus.socket system.slice sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "Before": "multi-user.target shutdown.target", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:32 EDT", "StateChangeTimestampMonotonic": "428428215", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveExitTimestampMonotonic": "428183898", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveEnterTimestampMonotonic": "428428215", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveExitTimestampMonotonic": "428067727", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveEnterTimestampMonotonic": "428180887", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ConditionTimestampMonotonic": "428181985", "AssertTimestamp": "Thu 2024-09-19 15:10:32 EDT", "AssertTimestampMonotonic": "428181987", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "e777597018b64b11af33cf8cce2131bb", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
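The "status" dictionary echoed above is simply the full property dump of tuned.service as reported by systemd. When only a handful of those properties matter (active state, substate, main PID, enablement), a read-only task along these lines retrieves the same data directly; the task name and register name here are illustrative and not part of the role:

- name: Spot-check the tuned unit properties returned by the systemd module
  ansible.builtin.command: systemctl show tuned.service --property=ActiveState,SubState,MainPID,UnitFileState
  register: tuned_show          # illustrative name
  changed_when: false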
9115 1726773042.28864: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9115 1726773042.28889: _low_level_execute_command(): starting 9115 1726773042.28896: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773041.6271691-9115-190479969626242/ > /dev/null 2>&1 && sleep 0' 9115 1726773042.31635: stderr chunk (state=2): >>><<< 9115 1726773042.31647: stdout chunk (state=2): >>><<< 9115 1726773042.31664: _low_level_execute_command() done: rc=0, stdout=, stderr= 9115 1726773042.31672: handler run complete 9115 1726773042.31727: attempt loop complete, returning result 9115 1726773042.31748: variable 'item' from source: unknown 9115 1726773042.31825: variable 'item' from source: unknown changed: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveEnterTimestampMonotonic": "428428215", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ActiveExitTimestampMonotonic": "428067727", "ActiveState": "active", "After": "sysinit.target systemd-journald.socket basic.target system.slice dbus.service network.target dbus.socket polkit.service systemd-sysctl.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:32 EDT", "AssertTimestampMonotonic": "428181987", "Before": "multi-user.target shutdown.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ConditionTimestampMonotonic": "428181985", "ConfigurationDirectoryMode": "0755", "Conflicts": 
"auto-cpufreq.service tlp.service shutdown.target cpupower.service power-profiles-daemon.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "8454", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:32 EDT", "ExecMainStartTimestampMonotonic": "428183847", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:32 EDT] ; stop_time=[n/a] ; pid=8454 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveEnterTimestampMonotonic": "428180887", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:32 EDT", "InactiveExitTimestampMonotonic": "428183898", "InvocationID": "e777597018b64b11af33cf8cce2131bb", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "8454", "MemoryAccounting": "yes", "MemoryCurrent": "15036416", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "dbus.service dbus.socket system.slice sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:32 EDT", "StateChangeTimestampMonotonic": "428428215", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:32 EDT", "WatchdogTimestampMonotonic": "428428211", "WatchdogUSec": "0" } } 9115 1726773042.31955: dumping result to json 9115 1726773042.31973: done dumping result, returning 9115 1726773042.31983: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-885f-bbcf-0000000000c0] 9115 1726773042.31992: sending task result for task 0affffe7-6841-885f-bbcf-0000000000c0 9115 1726773042.32093: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000c0 9115 1726773042.32097: WORKER PROCESS EXITING 8240 1726773042.32908: no more pending results, returning what we have 8240 1726773042.32912: results queue empty 8240 1726773042.32913: checking for any_errors_fatal 8240 1726773042.32919: done checking for any_errors_fatal 8240 1726773042.32920: checking for max_fail_percentage 8240 1726773042.32921: done checking for max_fail_percentage 8240 1726773042.32922: checking to see if all hosts have failed and the running result is not ok 8240 1726773042.32923: done checking to see if all hosts have failed 8240 1726773042.32923: getting the remaining hosts for this loop 8240 1726773042.32924: done getting the remaining hosts for this loop 8240 1726773042.32927: getting the next task for host managed_node2 8240 1726773042.32931: done getting next task for host managed_node2 8240 1726773042.32934: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8240 1726773042.32936: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773042.32945: getting variables 8240 1726773042.32946: in VariableManager get_vars() 8240 1726773042.32970: Calling all_inventory to load vars for managed_node2 8240 1726773042.32973: Calling groups_inventory to load vars for managed_node2 8240 1726773042.32975: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773042.32984: Calling all_plugins_play to load vars for managed_node2 8240 1726773042.32988: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773042.32991: Calling groups_plugins_play to load vars for managed_node2 8240 1726773042.33142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773042.33359: done with get_vars() 8240 1726773042.33369: done getting variables 8240 1726773042.33455: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:10:42 -0400 (0:00:00.773) 0:00:20.978 **** 8240 1726773042.33483: entering _queue_task() for managed_node2/command 8240 1726773042.33486: Creating lock for command 8240 1726773042.33694: worker is 1 (out of 1 available) 8240 1726773042.33706: exiting _queue_task() for managed_node2/command 8240 1726773042.33718: done queuing things up, now waiting for results queue to drain 8240 1726773042.33720: waiting for pending results... 
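The skip recorded below comes from the "is changed" test on a registered result: not __kernel_settings_register_profile is changed evaluates to False because the earlier profile task did report a change. The command this task would run never appears here, since the task never executes; the guard pattern itself is the standard register/when combination, sketched below with everything except the registered variable name being illustrative:

- name: Example task guarded by the same conditional
  ansible.builtin.debug:
    msg: runs only when __kernel_settings_register_profile reported no change
  when: not __kernel_settings_register_profile is changed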
9149 1726773042.33937: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 9149 1726773042.34068: in run() - task 0affffe7-6841-885f-bbcf-0000000000c1 9149 1726773042.34087: variable 'ansible_search_path' from source: unknown 9149 1726773042.34091: variable 'ansible_search_path' from source: unknown 9149 1726773042.34123: calling self._execute() 9149 1726773042.34198: variable 'ansible_host' from source: host vars for 'managed_node2' 9149 1726773042.34208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9149 1726773042.34216: variable 'omit' from source: magic vars 9149 1726773042.34645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9149 1726773042.34933: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9149 1726773042.34977: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9149 1726773042.35010: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9149 1726773042.35042: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9149 1726773042.35148: variable '__kernel_settings_register_profile' from source: set_fact 9149 1726773042.35175: Evaluated conditional (not __kernel_settings_register_profile is changed): False 9149 1726773042.35181: when evaluation is False, skipping this task 9149 1726773042.35187: _execute() done 9149 1726773042.35191: dumping result to json 9149 1726773042.35195: done dumping result, returning 9149 1726773042.35200: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-885f-bbcf-0000000000c1] 9149 1726773042.35205: sending task result for task 0affffe7-6841-885f-bbcf-0000000000c1 9149 1726773042.35233: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000c1 9149 1726773042.35236: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_register_profile is changed", "skip_reason": "Conditional result was False" } 8240 1726773042.35577: no more pending results, returning what we have 8240 1726773042.35580: results queue empty 8240 1726773042.35581: checking for any_errors_fatal 8240 1726773042.35598: done checking for any_errors_fatal 8240 1726773042.35599: checking for max_fail_percentage 8240 1726773042.35600: done checking for max_fail_percentage 8240 1726773042.35601: checking to see if all hosts have failed and the running result is not ok 8240 1726773042.35602: done checking to see if all hosts have failed 8240 1726773042.35602: getting the remaining hosts for this loop 8240 1726773042.35603: done getting the remaining hosts for this loop 8240 1726773042.35606: getting the next task for host managed_node2 8240 1726773042.35611: done getting next task for host managed_node2 8240 1726773042.35615: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8240 1726773042.35618: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773042.35630: getting variables 8240 1726773042.35632: in VariableManager get_vars() 8240 1726773042.35674: Calling all_inventory to load vars for managed_node2 8240 1726773042.35677: Calling groups_inventory to load vars for managed_node2 8240 1726773042.35679: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773042.35689: Calling all_plugins_play to load vars for managed_node2 8240 1726773042.35692: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773042.35694: Calling groups_plugins_play to load vars for managed_node2 8240 1726773042.35856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773042.36057: done with get_vars() 8240 1726773042.36068: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:10:42 -0400 (0:00:00.026) 0:00:21.005 **** 8240 1726773042.36157: entering _queue_task() for managed_node2/include_tasks 8240 1726773042.36352: worker is 1 (out of 1 available) 8240 1726773042.36364: exiting _queue_task() for managed_node2/include_tasks 8240 1726773042.36377: done queuing things up, now waiting for results queue to drain 8240 1726773042.36379: waiting for pending results... 9150 1726773042.37097: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 9150 1726773042.37229: in run() - task 0affffe7-6841-885f-bbcf-0000000000c2 9150 1726773042.37248: variable 'ansible_search_path' from source: unknown 9150 1726773042.37252: variable 'ansible_search_path' from source: unknown 9150 1726773042.37284: calling self._execute() 9150 1726773042.37362: variable 'ansible_host' from source: host vars for 'managed_node2' 9150 1726773042.37372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9150 1726773042.37382: variable 'omit' from source: magic vars 9150 1726773042.37805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9150 1726773042.38144: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9150 1726773042.38261: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9150 1726773042.38296: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9150 1726773042.38327: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9150 1726773042.38424: variable '__kernel_settings_register_apply' from source: set_fact 9150 1726773042.38451: Evaluated conditional (__kernel_settings_register_apply is changed): True 9150 1726773042.38458: _execute() done 9150 1726773042.38462: dumping result to json 9150 1726773042.38466: done dumping result, returning 9150 1726773042.38472: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-885f-bbcf-0000000000c2] 9150 1726773042.38477: sending task result for task 0affffe7-6841-885f-bbcf-0000000000c2 9150 1726773042.38506: done sending task result 
for task 0affffe7-6841-885f-bbcf-0000000000c2 9150 1726773042.38510: WORKER PROCESS EXITING 8240 1726773042.38856: no more pending results, returning what we have 8240 1726773042.38860: in VariableManager get_vars() 8240 1726773042.38898: Calling all_inventory to load vars for managed_node2 8240 1726773042.38902: Calling groups_inventory to load vars for managed_node2 8240 1726773042.38904: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773042.38913: Calling all_plugins_play to load vars for managed_node2 8240 1726773042.38916: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773042.38919: Calling groups_plugins_play to load vars for managed_node2 8240 1726773042.39082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773042.39327: done with get_vars() 8240 1726773042.39334: variable 'ansible_search_path' from source: unknown 8240 1726773042.39335: variable 'ansible_search_path' from source: unknown 8240 1726773042.39368: we have included files to process 8240 1726773042.39370: generating all_blocks data 8240 1726773042.39371: done generating all_blocks data 8240 1726773042.39376: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773042.39377: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773042.39379: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8240 1726773042.39810: done processing included file 8240 1726773042.39812: iterating over new_blocks loaded from include file 8240 1726773042.39813: in VariableManager get_vars() 8240 1726773042.39834: done with get_vars() 8240 1726773042.39838: filtering new block on tags 8240 1726773042.39895: done filtering new block on tags 8240 1726773042.39898: done iterating over new_blocks loaded from include file 8240 1726773042.39899: extending task lists for all hosts with included blocks 8240 1726773042.40543: done extending task lists 8240 1726773042.40545: done processing included files 8240 1726773042.40545: results queue empty 8240 1726773042.40546: checking for any_errors_fatal 8240 1726773042.40549: done checking for any_errors_fatal 8240 1726773042.40549: checking for max_fail_percentage 8240 1726773042.40550: done checking for max_fail_percentage 8240 1726773042.40551: checking to see if all hosts have failed and the running result is not ok 8240 1726773042.40552: done checking to see if all hosts have failed 8240 1726773042.40552: getting the remaining hosts for this loop 8240 1726773042.40553: done getting the remaining hosts for this loop 8240 1726773042.40556: getting the next task for host managed_node2 8240 1726773042.40560: done getting next task for host managed_node2 8240 1726773042.40562: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8240 1726773042.40564: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773042.40573: getting variables 8240 1726773042.40574: in VariableManager get_vars() 8240 1726773042.40588: Calling all_inventory to load vars for managed_node2 8240 1726773042.40590: Calling groups_inventory to load vars for managed_node2 8240 1726773042.40592: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773042.40597: Calling all_plugins_play to load vars for managed_node2 8240 1726773042.40599: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773042.40601: Calling groups_plugins_play to load vars for managed_node2 8240 1726773042.40929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773042.41124: done with get_vars() 8240 1726773042.41133: done getting variables 8240 1726773042.41170: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:10:42 -0400 (0:00:00.050) 0:00:21.055 **** 8240 1726773042.41201: entering _queue_task() for managed_node2/command 8240 1726773042.41410: worker is 1 (out of 1 available) 8240 1726773042.41421: exiting _queue_task() for managed_node2/command 8240 1726773042.41433: done queuing things up, now waiting for results queue to drain 8240 1726773042.41435: waiting for pending results... 
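The trace below shows this is a plain command task running tuned-adm verify -i. The final result is reported as changed: false even though the raw module output says changed: true, and a later task tests __kernel_settings_register_verify_values is failed, so the task is probably close to the following sketch; the register name is inferred from this log, and the changed_when/failed_when keywords are assumptions rather than confirmed role source:

- name: Check that settings are applied correctly
  command: tuned-adm verify -i
  register: __kernel_settings_register_verify_values
  changed_when: false            # inferred from the final "changed": false result
  failed_when: false             # assumption: verify failures are handled by the follow-up tasks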
9151 1726773042.41789: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 9151 1726773042.41944: in run() - task 0affffe7-6841-885f-bbcf-0000000001c9 9151 1726773042.41961: variable 'ansible_search_path' from source: unknown 9151 1726773042.41966: variable 'ansible_search_path' from source: unknown 9151 1726773042.41999: calling self._execute() 9151 1726773042.42077: variable 'ansible_host' from source: host vars for 'managed_node2' 9151 1726773042.42089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9151 1726773042.42099: variable 'omit' from source: magic vars 9151 1726773042.42192: variable 'omit' from source: magic vars 9151 1726773042.42251: variable 'omit' from source: magic vars 9151 1726773042.42283: variable 'omit' from source: magic vars 9151 1726773042.42325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9151 1726773042.42360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9151 1726773042.42381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9151 1726773042.42401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9151 1726773042.42413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9151 1726773042.42440: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9151 1726773042.42446: variable 'ansible_host' from source: host vars for 'managed_node2' 9151 1726773042.42450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9151 1726773042.42548: Set connection var ansible_pipelining to False 9151 1726773042.42556: Set connection var ansible_timeout to 10 9151 1726773042.42564: Set connection var ansible_module_compression to ZIP_DEFLATED 9151 1726773042.42568: Set connection var ansible_shell_type to sh 9151 1726773042.42573: Set connection var ansible_shell_executable to /bin/sh 9151 1726773042.42578: Set connection var ansible_connection to ssh 9151 1726773042.42599: variable 'ansible_shell_executable' from source: unknown 9151 1726773042.42604: variable 'ansible_connection' from source: unknown 9151 1726773042.42608: variable 'ansible_module_compression' from source: unknown 9151 1726773042.42611: variable 'ansible_shell_type' from source: unknown 9151 1726773042.42614: variable 'ansible_shell_executable' from source: unknown 9151 1726773042.42617: variable 'ansible_host' from source: host vars for 'managed_node2' 9151 1726773042.42620: variable 'ansible_pipelining' from source: unknown 9151 1726773042.42623: variable 'ansible_timeout' from source: unknown 9151 1726773042.42626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9151 1726773042.42843: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9151 1726773042.42856: variable 'omit' from source: magic vars 9151 1726773042.42862: starting attempt loop 9151 1726773042.42866: running the handler 9151 1726773042.42880: _low_level_execute_command(): starting 9151 
1726773042.42891: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9151 1726773042.46009: stdout chunk (state=2): >>>/root <<< 9151 1726773042.46126: stderr chunk (state=3): >>><<< 9151 1726773042.46134: stdout chunk (state=3): >>><<< 9151 1726773042.46159: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9151 1726773042.46175: _low_level_execute_command(): starting 9151 1726773042.46183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478 `" && echo ansible-tmp-1726773042.4616845-9151-95046847062478="` echo /root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478 `" ) && sleep 0' 9151 1726773042.49317: stdout chunk (state=2): >>>ansible-tmp-1726773042.4616845-9151-95046847062478=/root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478 <<< 9151 1726773042.49332: stderr chunk (state=2): >>><<< 9151 1726773042.49344: stdout chunk (state=3): >>><<< 9151 1726773042.49357: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773042.4616845-9151-95046847062478=/root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478 , stderr= 9151 1726773042.49388: variable 'ansible_module_compression' from source: unknown 9151 1726773042.49452: ANSIBALLZ: Using generic lock for ansible.legacy.command 9151 1726773042.49458: ANSIBALLZ: Acquiring lock 9151 1726773042.49462: ANSIBALLZ: Lock acquired: 139787572477392 9151 1726773042.49465: ANSIBALLZ: Creating module 9151 1726773042.63590: ANSIBALLZ: Writing module into payload 9151 1726773042.63706: ANSIBALLZ: Writing module 9151 1726773042.63731: ANSIBALLZ: Renaming module 9151 1726773042.63739: ANSIBALLZ: Done creating module 9151 1726773042.63755: variable 'ansible_facts' from source: unknown 9151 1726773042.63844: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478/AnsiballZ_command.py 9151 1726773042.64321: Sending initial data 9151 1726773042.64328: Sent initial data (153 bytes) 9151 1726773042.66932: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpm1860hf1 /root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478/AnsiballZ_command.py <<< 9151 1726773042.68391: stderr chunk (state=3): >>><<< 9151 1726773042.68402: stdout chunk (state=3): >>><<< 9151 1726773042.68427: done transferring module to remote 9151 1726773042.68442: _low_level_execute_command(): starting 9151 1726773042.68449: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478/ /root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478/AnsiballZ_command.py && sleep 0' 9151 1726773042.71894: stderr chunk (state=2): >>><<< 9151 1726773042.71906: stdout chunk (state=2): >>><<< 9151 1726773042.71924: _low_level_execute_command() done: rc=0, stdout=, stderr= 9151 1726773042.71930: _low_level_execute_command(): starting 9151 1726773042.71935: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478/AnsiballZ_command.py && sleep 0' 9151 1726773043.00527: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], 
"start": "2024-09-19 15:10:42.875842", "end": "2024-09-19 15:10:43.003270", "delta": "0:00:00.127428", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9151 1726773043.01734: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9151 1726773043.01746: stdout chunk (state=3): >>><<< 9151 1726773043.01757: stderr chunk (state=3): >>><<< 9151 1726773043.01774: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:10:42.875842", "end": "2024-09-19 15:10:43.003270", "delta": "0:00:00.127428", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 9151 1726773043.01825: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9151 1726773043.01837: _low_level_execute_command(): starting 9151 1726773043.01843: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773042.4616845-9151-95046847062478/ > /dev/null 2>&1 && sleep 0' 9151 1726773043.04603: stderr chunk (state=2): >>><<< 9151 1726773043.04614: stdout chunk (state=2): >>><<< 9151 1726773043.04631: _low_level_execute_command() done: rc=0, stdout=, stderr= 9151 1726773043.04639: handler run complete 9151 1726773043.04664: Evaluated conditional (False): False 9151 1726773043.04680: attempt loop complete, returning result 9151 1726773043.04684: _execute() done 9151 1726773043.04689: dumping result to json 9151 1726773043.04695: done dumping result, returning 9151 1726773043.04702: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-885f-bbcf-0000000001c9] 9151 1726773043.04708: sending task result for task 0affffe7-6841-885f-bbcf-0000000001c9 9151 1726773043.04749: done sending task result for task 0affffe7-6841-885f-bbcf-0000000001c9 9151 1726773043.04753: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.127428", "end": "2024-09-19 15:10:43.003270", "rc": 0, "start": "2024-09-19 15:10:42.875842" } STDOUT: Verification succeeded, current system settings match the preset profile. 
See TuneD log file ('/var/log/tuned/tuned.log') for details. 8240 1726773043.05231: no more pending results, returning what we have 8240 1726773043.05234: results queue empty 8240 1726773043.05235: checking for any_errors_fatal 8240 1726773043.05237: done checking for any_errors_fatal 8240 1726773043.05237: checking for max_fail_percentage 8240 1726773043.05238: done checking for max_fail_percentage 8240 1726773043.05239: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.05240: done checking to see if all hosts have failed 8240 1726773043.05240: getting the remaining hosts for this loop 8240 1726773043.05241: done getting the remaining hosts for this loop 8240 1726773043.05245: getting the next task for host managed_node2 8240 1726773043.05252: done getting next task for host managed_node2 8240 1726773043.05255: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8240 1726773043.05259: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773043.05271: getting variables 8240 1726773043.05272: in VariableManager get_vars() 8240 1726773043.05305: Calling all_inventory to load vars for managed_node2 8240 1726773043.05308: Calling groups_inventory to load vars for managed_node2 8240 1726773043.05310: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.05319: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.05322: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.05325: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.05497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.05709: done with get_vars() 8240 1726773043.05719: done getting variables 8240 1726773043.05808: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:10:43 -0400 (0:00:00.646) 0:00:21.702 **** 8240 1726773043.05840: entering _queue_task() for managed_node2/shell 8240 1726773043.05842: Creating lock for shell 8240 1726773043.06058: worker is 1 (out of 1 available) 8240 1726773043.06074: exiting _queue_task() for managed_node2/shell 8240 1726773043.06086: done queuing things up, now waiting for results queue to drain 8240 1726773043.06088: waiting for pending results... 
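A minimal sketch of the kind of task queued here, inferred only from this log (a shell action at verify_settings.yml:12 gated on the earlier verify result); the shell command and register name below are assumptions, not the role's actual code:

- name: Get last verify results from log
  # Runs only when the preceding 'tuned-adm verify -i' step reported a failure;
  # in this run the conditional is False, so the task is skipped (see below).
  ansible.builtin.shell: grep -i 'verify' /var/log/tuned/tuned.log | tail -n 20   # assumed command
  register: __kernel_settings_register_verify_log   # assumed variable name
  changed_when: false
  when: __kernel_settings_register_verify_values is failed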
9200 1726773043.06603: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 9200 1726773043.06770: in run() - task 0affffe7-6841-885f-bbcf-0000000001ca 9200 1726773043.06790: variable 'ansible_search_path' from source: unknown 9200 1726773043.06795: variable 'ansible_search_path' from source: unknown 9200 1726773043.06827: calling self._execute() 9200 1726773043.06911: variable 'ansible_host' from source: host vars for 'managed_node2' 9200 1726773043.06921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9200 1726773043.06929: variable 'omit' from source: magic vars 9200 1726773043.07442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9200 1726773043.07723: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9200 1726773043.07765: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9200 1726773043.07803: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9200 1726773043.07835: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9200 1726773043.07942: variable '__kernel_settings_register_verify_values' from source: set_fact 9200 1726773043.07971: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 9200 1726773043.07977: when evaluation is False, skipping this task 9200 1726773043.07981: _execute() done 9200 1726773043.07984: dumping result to json 9200 1726773043.07990: done dumping result, returning 9200 1726773043.07995: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-885f-bbcf-0000000001ca] 9200 1726773043.08001: sending task result for task 0affffe7-6841-885f-bbcf-0000000001ca 9200 1726773043.08030: done sending task result for task 0affffe7-6841-885f-bbcf-0000000001ca 9200 1726773043.08034: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773043.08390: no more pending results, returning what we have 8240 1726773043.08394: results queue empty 8240 1726773043.08395: checking for any_errors_fatal 8240 1726773043.08402: done checking for any_errors_fatal 8240 1726773043.08403: checking for max_fail_percentage 8240 1726773043.08404: done checking for max_fail_percentage 8240 1726773043.08405: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.08406: done checking to see if all hosts have failed 8240 1726773043.08406: getting the remaining hosts for this loop 8240 1726773043.08407: done getting the remaining hosts for this loop 8240 1726773043.08410: getting the next task for host managed_node2 8240 1726773043.08415: done getting next task for host managed_node2 8240 1726773043.08419: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8240 1726773043.08423: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773043.08434: getting variables 8240 1726773043.08435: in VariableManager get_vars() 8240 1726773043.08469: Calling all_inventory to load vars for managed_node2 8240 1726773043.08472: Calling groups_inventory to load vars for managed_node2 8240 1726773043.08474: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.08483: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.08487: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.08490: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.08706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.08912: done with get_vars() 8240 1726773043.08922: done getting variables 8240 1726773043.08977: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:10:43 -0400 (0:00:00.031) 0:00:21.733 **** 8240 1726773043.09011: entering _queue_task() for managed_node2/fail 8240 1726773043.09208: worker is 1 (out of 1 available) 8240 1726773043.09219: exiting _queue_task() for managed_node2/fail 8240 1726773043.09231: done queuing things up, now waiting for results queue to drain 8240 1726773043.09233: waiting for pending results... 
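Likewise only a sketch: judging from the fail action loaded above and the conditional evaluated in the trace that follows, the task at verify_settings.yml:23 is shaped roughly like this; the message wording is an assumption:

- name: Report errors that are not bootloader errors
  ansible.builtin.fail:
    msg: tuned-adm verify reported settings that do not match the applied profile   # assumed wording
  when: __kernel_settings_register_verify_values is failed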
9201 1726773043.10261: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 9201 1726773043.10416: in run() - task 0affffe7-6841-885f-bbcf-0000000001cb 9201 1726773043.10433: variable 'ansible_search_path' from source: unknown 9201 1726773043.10437: variable 'ansible_search_path' from source: unknown 9201 1726773043.10470: calling self._execute() 9201 1726773043.10547: variable 'ansible_host' from source: host vars for 'managed_node2' 9201 1726773043.10557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9201 1726773043.10565: variable 'omit' from source: magic vars 9201 1726773043.10989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9201 1726773043.11269: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9201 1726773043.11316: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9201 1726773043.11348: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9201 1726773043.11380: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9201 1726773043.11489: variable '__kernel_settings_register_verify_values' from source: set_fact 9201 1726773043.11516: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 9201 1726773043.11522: when evaluation is False, skipping this task 9201 1726773043.11526: _execute() done 9201 1726773043.11529: dumping result to json 9201 1726773043.11532: done dumping result, returning 9201 1726773043.11538: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-885f-bbcf-0000000001cb] 9201 1726773043.11543: sending task result for task 0affffe7-6841-885f-bbcf-0000000001cb 9201 1726773043.11572: done sending task result for task 0affffe7-6841-885f-bbcf-0000000001cb 9201 1726773043.11576: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773043.11927: no more pending results, returning what we have 8240 1726773043.11930: results queue empty 8240 1726773043.11931: checking for any_errors_fatal 8240 1726773043.11936: done checking for any_errors_fatal 8240 1726773043.11938: checking for max_fail_percentage 8240 1726773043.11939: done checking for max_fail_percentage 8240 1726773043.11940: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.11941: done checking to see if all hosts have failed 8240 1726773043.11941: getting the remaining hosts for this loop 8240 1726773043.11942: done getting the remaining hosts for this loop 8240 1726773043.11945: getting the next task for host managed_node2 8240 1726773043.11953: done getting next task for host managed_node2 8240 1726773043.11956: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8240 1726773043.11959: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773043.11971: getting variables 8240 1726773043.11972: in VariableManager get_vars() 8240 1726773043.12006: Calling all_inventory to load vars for managed_node2 8240 1726773043.12009: Calling groups_inventory to load vars for managed_node2 8240 1726773043.12011: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.12020: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.12023: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.12026: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.12195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.12381: done with get_vars() 8240 1726773043.12392: done getting variables 8240 1726773043.12446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:10:43 -0400 (0:00:00.034) 0:00:21.768 **** 8240 1726773043.12475: entering _queue_task() for managed_node2/set_fact 8240 1726773043.12668: worker is 1 (out of 1 available) 8240 1726773043.12681: exiting _queue_task() for managed_node2/set_fact 8240 1726773043.12695: done queuing things up, now waiting for results queue to drain 8240 1726773043.12698: waiting for pending results... 
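The set_fact queued here ends up publishing kernel_settings_reboot_required: false (see the result below). A minimal equivalent, ignoring however the role actually derives the value, would be:

- name: Set the flag that reboot is needed to apply changes
  ansible.builtin.set_fact:
    # false in this run; how the role computes the value is not visible in this log
    kernel_settings_reboot_required: false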
9202 1726773043.12922: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 9202 1726773043.13061: in run() - task 0affffe7-6841-885f-bbcf-0000000000c3 9202 1726773043.13081: variable 'ansible_search_path' from source: unknown 9202 1726773043.13088: variable 'ansible_search_path' from source: unknown 9202 1726773043.13120: calling self._execute() 9202 1726773043.13205: variable 'ansible_host' from source: host vars for 'managed_node2' 9202 1726773043.13217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9202 1726773043.13227: variable 'omit' from source: magic vars 9202 1726773043.13313: variable 'omit' from source: magic vars 9202 1726773043.13358: variable 'omit' from source: magic vars 9202 1726773043.13393: variable 'omit' from source: magic vars 9202 1726773043.13432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9202 1726773043.13463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9202 1726773043.13490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9202 1726773043.13507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9202 1726773043.13520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9202 1726773043.13551: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9202 1726773043.13557: variable 'ansible_host' from source: host vars for 'managed_node2' 9202 1726773043.13561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9202 1726773043.13665: Set connection var ansible_pipelining to False 9202 1726773043.13677: Set connection var ansible_timeout to 10 9202 1726773043.13751: Set connection var ansible_module_compression to ZIP_DEFLATED 9202 1726773043.13757: Set connection var ansible_shell_type to sh 9202 1726773043.13763: Set connection var ansible_shell_executable to /bin/sh 9202 1726773043.13772: Set connection var ansible_connection to ssh 9202 1726773043.13793: variable 'ansible_shell_executable' from source: unknown 9202 1726773043.13798: variable 'ansible_connection' from source: unknown 9202 1726773043.13804: variable 'ansible_module_compression' from source: unknown 9202 1726773043.13808: variable 'ansible_shell_type' from source: unknown 9202 1726773043.13811: variable 'ansible_shell_executable' from source: unknown 9202 1726773043.13814: variable 'ansible_host' from source: host vars for 'managed_node2' 9202 1726773043.13818: variable 'ansible_pipelining' from source: unknown 9202 1726773043.13821: variable 'ansible_timeout' from source: unknown 9202 1726773043.13824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9202 1726773043.13956: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9202 1726773043.13973: variable 'omit' from source: magic vars 9202 1726773043.13980: starting attempt loop 9202 1726773043.13984: running the handler 9202 1726773043.13997: handler run complete 9202 1726773043.14008: 
attempt loop complete, returning result 9202 1726773043.14011: _execute() done 9202 1726773043.14014: dumping result to json 9202 1726773043.14017: done dumping result, returning 9202 1726773043.14023: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-0000000000c3] 9202 1726773043.14029: sending task result for task 0affffe7-6841-885f-bbcf-0000000000c3 9202 1726773043.14054: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000c3 9202 1726773043.14058: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8240 1726773043.14424: no more pending results, returning what we have 8240 1726773043.14427: results queue empty 8240 1726773043.14428: checking for any_errors_fatal 8240 1726773043.14434: done checking for any_errors_fatal 8240 1726773043.14434: checking for max_fail_percentage 8240 1726773043.14436: done checking for max_fail_percentage 8240 1726773043.14437: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.14437: done checking to see if all hosts have failed 8240 1726773043.14438: getting the remaining hosts for this loop 8240 1726773043.14439: done getting the remaining hosts for this loop 8240 1726773043.14443: getting the next task for host managed_node2 8240 1726773043.14448: done getting next task for host managed_node2 8240 1726773043.14452: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8240 1726773043.14455: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773043.14465: getting variables 8240 1726773043.14466: in VariableManager get_vars() 8240 1726773043.14498: Calling all_inventory to load vars for managed_node2 8240 1726773043.14501: Calling groups_inventory to load vars for managed_node2 8240 1726773043.14503: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.14511: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.14513: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.14515: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.14656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.14771: done with get_vars() 8240 1726773043.14779: done getting variables 8240 1726773043.14833: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:10:43 -0400 (0:00:00.023) 0:00:21.792 **** 8240 1726773043.14865: entering _queue_task() for managed_node2/set_fact 8240 1726773043.15079: worker is 1 (out of 1 available) 8240 1726773043.15093: exiting _queue_task() for managed_node2/set_fact 8240 1726773043.15105: done queuing things up, now waiting for results queue to drain 8240 1726773043.15106: waiting for pending results... 9209 1726773043.15317: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 9209 1726773043.15453: in run() - task 0affffe7-6841-885f-bbcf-0000000000c4 9209 1726773043.15471: variable 'ansible_search_path' from source: unknown 9209 1726773043.15475: variable 'ansible_search_path' from source: unknown 9209 1726773043.15510: calling self._execute() 9209 1726773043.15595: variable 'ansible_host' from source: host vars for 'managed_node2' 9209 1726773043.15605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9209 1726773043.15613: variable 'omit' from source: magic vars 9209 1726773043.15712: variable 'omit' from source: magic vars 9209 1726773043.15756: variable 'omit' from source: magic vars 9209 1726773043.16112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9209 1726773043.16392: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9209 1726773043.16433: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9209 1726773043.16464: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9209 1726773043.16498: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9209 1726773043.16631: variable '__kernel_settings_register_profile' from source: set_fact 9209 1726773043.16646: variable '__kernel_settings_register_mode' from source: set_fact 9209 1726773043.16654: variable '__kernel_settings_register_apply' from source: set_fact 9209 1726773043.16704: variable 'omit' from source: magic vars 9209 1726773043.16730: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9209 1726773043.16791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9209 1726773043.16810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9209 1726773043.16828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9209 1726773043.16839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9209 1726773043.16867: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9209 1726773043.16875: variable 'ansible_host' from source: host vars for 'managed_node2' 9209 1726773043.16879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9209 1726773043.16992: Set connection var ansible_pipelining to False 9209 1726773043.17000: Set connection var ansible_timeout to 10 9209 1726773043.17008: Set connection var ansible_module_compression to ZIP_DEFLATED 9209 1726773043.17012: Set connection var ansible_shell_type to sh 9209 1726773043.17017: Set connection var ansible_shell_executable to /bin/sh 9209 1726773043.17022: Set connection var ansible_connection to ssh 9209 1726773043.17043: variable 'ansible_shell_executable' from source: unknown 9209 1726773043.17048: variable 'ansible_connection' from source: unknown 9209 1726773043.17052: variable 'ansible_module_compression' from source: unknown 9209 1726773043.17055: variable 'ansible_shell_type' from source: unknown 9209 1726773043.17058: variable 'ansible_shell_executable' from source: unknown 9209 1726773043.17061: variable 'ansible_host' from source: host vars for 'managed_node2' 9209 1726773043.17064: variable 'ansible_pipelining' from source: unknown 9209 1726773043.17067: variable 'ansible_timeout' from source: unknown 9209 1726773043.17071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9209 1726773043.17168: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9209 1726773043.17180: variable 'omit' from source: magic vars 9209 1726773043.17188: starting attempt loop 9209 1726773043.17191: running the handler 9209 1726773043.17202: handler run complete 9209 1726773043.17212: attempt loop complete, returning result 9209 1726773043.17215: _execute() done 9209 1726773043.17218: dumping result to json 9209 1726773043.17222: done dumping result, returning 9209 1726773043.17228: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-885f-bbcf-0000000000c4] 9209 1726773043.17234: sending task result for task 0affffe7-6841-885f-bbcf-0000000000c4 9209 1726773043.17258: done sending task result for task 0affffe7-6841-885f-bbcf-0000000000c4 9209 1726773043.17261: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8240 1726773043.17516: no more pending results, returning what we have 8240 1726773043.17518: results queue empty 8240 1726773043.17520: checking for any_errors_fatal 8240 1726773043.17527: done 
checking for any_errors_fatal 8240 1726773043.17528: checking for max_fail_percentage 8240 1726773043.17530: done checking for max_fail_percentage 8240 1726773043.17530: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.17532: done checking to see if all hosts have failed 8240 1726773043.17532: getting the remaining hosts for this loop 8240 1726773043.17534: done getting the remaining hosts for this loop 8240 1726773043.17538: getting the next task for host managed_node2 8240 1726773043.17548: done getting next task for host managed_node2 8240 1726773043.17550: ^ task is: TASK: meta (role_complete) 8240 1726773043.17552: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773043.17563: getting variables 8240 1726773043.17565: in VariableManager get_vars() 8240 1726773043.17603: Calling all_inventory to load vars for managed_node2 8240 1726773043.17607: Calling groups_inventory to load vars for managed_node2 8240 1726773043.17609: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.17620: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.17624: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.17627: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.17807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.18009: done with get_vars() 8240 1726773043.18020: done getting variables 8240 1726773043.18097: done queuing things up, now waiting for results queue to drain 8240 1726773043.18099: results queue empty 8240 1726773043.18100: checking for any_errors_fatal 8240 1726773043.18103: done checking for any_errors_fatal 8240 1726773043.18103: checking for max_fail_percentage 8240 1726773043.18104: done checking for max_fail_percentage 8240 1726773043.18110: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.18110: done checking to see if all hosts have failed 8240 1726773043.18111: getting the remaining hosts for this loop 8240 1726773043.18111: done getting the remaining hosts for this loop 8240 1726773043.18113: getting the next task for host managed_node2 8240 1726773043.18116: done getting next task for host managed_node2 8240 1726773043.18118: ^ task is: TASK: Ensure kernel_settings_reboot_required is unset or undefined 8240 1726773043.18120: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773043.18122: getting variables 8240 1726773043.18123: in VariableManager get_vars() 8240 1726773043.18132: Calling all_inventory to load vars for managed_node2 8240 1726773043.18134: Calling groups_inventory to load vars for managed_node2 8240 1726773043.18136: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.18139: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.18141: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.18143: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.18300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.18471: done with get_vars() 8240 1726773043.18480: done getting variables 8240 1726773043.18559: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure kernel_settings_reboot_required is unset or undefined] ************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:71 Thursday 19 September 2024 15:10:43 -0400 (0:00:00.037) 0:00:21.829 **** 8240 1726773043.18592: entering _queue_task() for managed_node2/assert 8240 1726773043.18594: Creating lock for assert 8240 1726773043.18994: worker is 1 (out of 1 available) 8240 1726773043.19005: exiting _queue_task() for managed_node2/assert 8240 1726773043.19018: done queuing things up, now waiting for results queue to drain 8240 1726773043.19020: waiting for pending results... 
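The assert queued here uses the conditional that appears later in the trace (not kernel_settings_reboot_required | d(false)); as a sketch, the test task is essentially:

- name: Ensure kernel_settings_reboot_required is unset or undefined
  ansible.builtin.assert:
    that:
      - not kernel_settings_reboot_required | d(false)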
9211 1726773043.19225: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is unset or undefined 9211 1726773043.19344: in run() - task 0affffe7-6841-885f-bbcf-000000000010 9211 1726773043.19363: variable 'ansible_search_path' from source: unknown 9211 1726773043.19398: calling self._execute() 9211 1726773043.19482: variable 'ansible_host' from source: host vars for 'managed_node2' 9211 1726773043.19494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9211 1726773043.19504: variable 'omit' from source: magic vars 9211 1726773043.19605: variable 'omit' from source: magic vars 9211 1726773043.19633: variable 'omit' from source: magic vars 9211 1726773043.19661: variable 'omit' from source: magic vars 9211 1726773043.19701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9211 1726773043.19733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9211 1726773043.19753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9211 1726773043.19770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9211 1726773043.19783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9211 1726773043.19816: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9211 1726773043.19823: variable 'ansible_host' from source: host vars for 'managed_node2' 9211 1726773043.19827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9211 1726773043.19926: Set connection var ansible_pipelining to False 9211 1726773043.19933: Set connection var ansible_timeout to 10 9211 1726773043.19942: Set connection var ansible_module_compression to ZIP_DEFLATED 9211 1726773043.19946: Set connection var ansible_shell_type to sh 9211 1726773043.19951: Set connection var ansible_shell_executable to /bin/sh 9211 1726773043.19956: Set connection var ansible_connection to ssh 9211 1726773043.19981: variable 'ansible_shell_executable' from source: unknown 9211 1726773043.19988: variable 'ansible_connection' from source: unknown 9211 1726773043.19992: variable 'ansible_module_compression' from source: unknown 9211 1726773043.19995: variable 'ansible_shell_type' from source: unknown 9211 1726773043.19999: variable 'ansible_shell_executable' from source: unknown 9211 1726773043.20002: variable 'ansible_host' from source: host vars for 'managed_node2' 9211 1726773043.20005: variable 'ansible_pipelining' from source: unknown 9211 1726773043.20008: variable 'ansible_timeout' from source: unknown 9211 1726773043.20014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9211 1726773043.20240: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9211 1726773043.20253: variable 'omit' from source: magic vars 9211 1726773043.20259: starting attempt loop 9211 1726773043.20263: running the handler 9211 1726773043.20599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9211 1726773043.23469: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9211 1726773043.23547: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9211 1726773043.23583: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9211 1726773043.23619: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9211 1726773043.23645: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9211 1726773043.23710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9211 1726773043.23737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9211 1726773043.23762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9211 1726773043.23802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9211 1726773043.23816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9211 1726773043.23933: variable 'kernel_settings_reboot_required' from source: set_fact 9211 1726773043.23952: Evaluated conditional (not kernel_settings_reboot_required | d(false)): True 9211 1726773043.23960: handler run complete 9211 1726773043.23982: attempt loop complete, returning result 9211 1726773043.23992: _execute() done 9211 1726773043.23998: dumping result to json 9211 1726773043.24002: done dumping result, returning 9211 1726773043.24009: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is unset or undefined [0affffe7-6841-885f-bbcf-000000000010] 9211 1726773043.24015: sending task result for task 0affffe7-6841-885f-bbcf-000000000010 9211 1726773043.24044: done sending task result for task 0affffe7-6841-885f-bbcf-000000000010 9211 1726773043.24048: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8240 1726773043.24454: no more pending results, returning what we have 8240 1726773043.24458: results queue empty 8240 1726773043.24459: checking for any_errors_fatal 8240 1726773043.24462: done checking for any_errors_fatal 8240 1726773043.24463: checking for max_fail_percentage 8240 1726773043.24465: done checking for max_fail_percentage 8240 1726773043.24469: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.24470: done checking to see if all hosts have failed 8240 1726773043.24471: getting the remaining hosts for this loop 8240 1726773043.24472: done getting the remaining hosts for this loop 8240 1726773043.24477: getting the next task for host managed_node2 8240 1726773043.24482: done getting next task for host managed_node2 8240 1726773043.25590: ^ task is: TASK: Ensure role reported changed 8240 1726773043.25594: ^ state is: HOST STATE: block=2, task=14, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773043.25597: getting variables 8240 1726773043.25599: in VariableManager get_vars() 8240 1726773043.25634: Calling all_inventory to load vars for managed_node2 8240 1726773043.25637: Calling groups_inventory to load vars for managed_node2 8240 1726773043.25639: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.25650: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.25659: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.25663: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.26895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.27078: done with get_vars() 8240 1726773043.27090: done getting variables 8240 1726773043.27138: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:75 Thursday 19 September 2024 15:10:43 -0400 (0:00:00.085) 0:00:21.915 **** 8240 1726773043.27164: entering _queue_task() for managed_node2/assert 8240 1726773043.27369: worker is 1 (out of 1 available) 8240 1726773043.27381: exiting _queue_task() for managed_node2/assert 8240 1726773043.27393: done queuing things up, now waiting for results queue to drain 8240 1726773043.27394: waiting for pending results... 
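Same pattern for the next check; the evaluated conditional shown below (__kernel_settings_changed | d(false)) implies a test task along these lines:

- name: Ensure role reported changed
  ansible.builtin.assert:
    that:
      - __kernel_settings_changed | d(false)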
9213 1726773043.28701: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 9213 1726773043.28821: in run() - task 0affffe7-6841-885f-bbcf-000000000011 9213 1726773043.28838: variable 'ansible_search_path' from source: unknown 9213 1726773043.28872: calling self._execute() 9213 1726773043.28957: variable 'ansible_host' from source: host vars for 'managed_node2' 9213 1726773043.28971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9213 1726773043.28982: variable 'omit' from source: magic vars 9213 1726773043.29086: variable 'omit' from source: magic vars 9213 1726773043.29116: variable 'omit' from source: magic vars 9213 1726773043.29148: variable 'omit' from source: magic vars 9213 1726773043.29191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9213 1726773043.29224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9213 1726773043.29248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9213 1726773043.29269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9213 1726773043.29282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9213 1726773043.29313: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9213 1726773043.29319: variable 'ansible_host' from source: host vars for 'managed_node2' 9213 1726773043.29323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9213 1726773043.29422: Set connection var ansible_pipelining to False 9213 1726773043.29431: Set connection var ansible_timeout to 10 9213 1726773043.29439: Set connection var ansible_module_compression to ZIP_DEFLATED 9213 1726773043.29443: Set connection var ansible_shell_type to sh 9213 1726773043.29448: Set connection var ansible_shell_executable to /bin/sh 9213 1726773043.29453: Set connection var ansible_connection to ssh 9213 1726773043.29476: variable 'ansible_shell_executable' from source: unknown 9213 1726773043.29481: variable 'ansible_connection' from source: unknown 9213 1726773043.29484: variable 'ansible_module_compression' from source: unknown 9213 1726773043.29489: variable 'ansible_shell_type' from source: unknown 9213 1726773043.29493: variable 'ansible_shell_executable' from source: unknown 9213 1726773043.29496: variable 'ansible_host' from source: host vars for 'managed_node2' 9213 1726773043.29500: variable 'ansible_pipelining' from source: unknown 9213 1726773043.29503: variable 'ansible_timeout' from source: unknown 9213 1726773043.29507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9213 1726773043.30754: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9213 1726773043.30775: variable 'omit' from source: magic vars 9213 1726773043.30782: starting attempt loop 9213 1726773043.30788: running the handler 9213 1726773043.31142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9213 1726773043.35700: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9213 1726773043.35771: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9213 1726773043.35811: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9213 1726773043.35846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9213 1726773043.35874: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9213 1726773043.35942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9213 1726773043.35988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9213 1726773043.36014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9213 1726773043.36052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9213 1726773043.36070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9213 1726773043.36174: variable '__kernel_settings_changed' from source: set_fact 9213 1726773043.37301: Evaluated conditional (__kernel_settings_changed | d(false)): True 9213 1726773043.37312: handler run complete 9213 1726773043.37333: attempt loop complete, returning result 9213 1726773043.37337: _execute() done 9213 1726773043.37340: dumping result to json 9213 1726773043.37343: done dumping result, returning 9213 1726773043.37349: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [0affffe7-6841-885f-bbcf-000000000011] 9213 1726773043.37355: sending task result for task 0affffe7-6841-885f-bbcf-000000000011 9213 1726773043.37384: done sending task result for task 0affffe7-6841-885f-bbcf-000000000011 9213 1726773043.37389: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8240 1726773043.37769: no more pending results, returning what we have 8240 1726773043.37772: results queue empty 8240 1726773043.37773: checking for any_errors_fatal 8240 1726773043.37778: done checking for any_errors_fatal 8240 1726773043.37779: checking for max_fail_percentage 8240 1726773043.37780: done checking for max_fail_percentage 8240 1726773043.37781: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.37781: done checking to see if all hosts have failed 8240 1726773043.37782: getting the remaining hosts for this loop 8240 1726773043.37783: done getting the remaining hosts for this loop 8240 1726773043.37788: getting the next task for host managed_node2 8240 1726773043.37794: done getting next task for host managed_node2 8240 1726773043.37796: ^ task is: TASK: Check sysfs after role runs 8240 1726773043.37798: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773043.37801: getting variables 8240 1726773043.37802: in VariableManager get_vars() 8240 1726773043.37834: Calling all_inventory to load vars for managed_node2 8240 1726773043.37837: Calling groups_inventory to load vars for managed_node2 8240 1726773043.37839: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.37849: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.37852: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.37860: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.38031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.38276: done with get_vars() 8240 1726773043.38287: done getting variables 8240 1726773043.38346: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:79 Thursday 19 September 2024 15:10:43 -0400 (0:00:00.112) 0:00:22.027 **** 8240 1726773043.38371: entering _queue_task() for managed_node2/command 8240 1726773043.38560: worker is 1 (out of 1 available) 8240 1726773043.38576: exiting _queue_task() for managed_node2/command 8240 1726773043.38590: done queuing things up, now waiting for results queue to drain 8240 1726773043.38592: waiting for pending results... 
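The command executed for this check is visible further down (grep -x 65000 /sys/class/net/lo/mtu), so the task can be reconstructed fairly closely; changed_when is an inference from the ok/changed-false result:

- name: Check sysfs after role runs
  ansible.builtin.command: grep -x 65000 /sys/class/net/lo/mtu
  changed_when: false   # inferred: the callback reports the task as ok, not changed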
9216 1726773043.38806: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 9216 1726773043.38927: in run() - task 0affffe7-6841-885f-bbcf-000000000012 9216 1726773043.38945: variable 'ansible_search_path' from source: unknown 9216 1726773043.38980: calling self._execute() 9216 1726773043.39065: variable 'ansible_host' from source: host vars for 'managed_node2' 9216 1726773043.39077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9216 1726773043.39089: variable 'omit' from source: magic vars 9216 1726773043.39186: variable 'omit' from source: magic vars 9216 1726773043.39220: variable 'omit' from source: magic vars 9216 1726773043.39253: variable 'omit' from source: magic vars 9216 1726773043.39300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9216 1726773043.39336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9216 1726773043.39359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9216 1726773043.39380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9216 1726773043.39396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9216 1726773043.39428: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9216 1726773043.39435: variable 'ansible_host' from source: host vars for 'managed_node2' 9216 1726773043.39439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9216 1726773043.39545: Set connection var ansible_pipelining to False 9216 1726773043.39553: Set connection var ansible_timeout to 10 9216 1726773043.39561: Set connection var ansible_module_compression to ZIP_DEFLATED 9216 1726773043.39564: Set connection var ansible_shell_type to sh 9216 1726773043.39573: Set connection var ansible_shell_executable to /bin/sh 9216 1726773043.39578: Set connection var ansible_connection to ssh 9216 1726773043.39598: variable 'ansible_shell_executable' from source: unknown 9216 1726773043.39603: variable 'ansible_connection' from source: unknown 9216 1726773043.39607: variable 'ansible_module_compression' from source: unknown 9216 1726773043.39610: variable 'ansible_shell_type' from source: unknown 9216 1726773043.39612: variable 'ansible_shell_executable' from source: unknown 9216 1726773043.39615: variable 'ansible_host' from source: host vars for 'managed_node2' 9216 1726773043.39619: variable 'ansible_pipelining' from source: unknown 9216 1726773043.39622: variable 'ansible_timeout' from source: unknown 9216 1726773043.39626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9216 1726773043.41955: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9216 1726773043.41973: variable 'omit' from source: magic vars 9216 1726773043.41982: starting attempt loop 9216 1726773043.41987: running the handler 9216 1726773043.42003: _low_level_execute_command(): starting 9216 1726773043.42012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9216 1726773043.44710: stdout chunk (state=2): >>>/root 
<<< 9216 1726773043.44810: stderr chunk (state=3): >>><<< 9216 1726773043.44817: stdout chunk (state=3): >>><<< 9216 1726773043.44838: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9216 1726773043.44853: _low_level_execute_command(): starting 9216 1726773043.44859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139 `" && echo ansible-tmp-1726773043.448468-9216-148526549280139="` echo /root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139 `" ) && sleep 0' 9216 1726773043.48011: stdout chunk (state=2): >>>ansible-tmp-1726773043.448468-9216-148526549280139=/root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139 <<< 9216 1726773043.48076: stderr chunk (state=3): >>><<< 9216 1726773043.48084: stdout chunk (state=3): >>><<< 9216 1726773043.48104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773043.448468-9216-148526549280139=/root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139 , stderr= 9216 1726773043.48135: variable 'ansible_module_compression' from source: unknown 9216 1726773043.48201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9216 1726773043.48238: variable 'ansible_facts' from source: unknown 9216 1726773043.48341: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139/AnsiballZ_command.py 9216 1726773043.49142: Sending initial data 9216 1726773043.49150: Sent initial data (153 bytes) 9216 1726773043.53051: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmppxj_7n4_ /root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139/AnsiballZ_command.py <<< 9216 1726773043.54740: stderr chunk (state=3): >>><<< 9216 1726773043.54751: stdout chunk (state=3): >>><<< 9216 1726773043.54775: done transferring module to remote 9216 1726773043.54789: _low_level_execute_command(): starting 9216 1726773043.54796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139/ /root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139/AnsiballZ_command.py && sleep 0' 9216 1726773043.58992: stderr chunk (state=2): >>><<< 9216 1726773043.59002: stdout chunk (state=2): >>><<< 9216 1726773043.59017: _low_level_execute_command() done: rc=0, stdout=, stderr= 9216 1726773043.59021: _low_level_execute_command(): starting 9216 1726773043.59026: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139/AnsiballZ_command.py && sleep 0' 9216 1726773043.74825: stdout chunk (state=2): >>> {"changed": true, "stdout": "65000", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "start": "2024-09-19 15:10:43.743666", "end": "2024-09-19 15:10:43.746820", "delta": "0:00:00.003154", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9216 1726773043.76016: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 9216 1726773043.76029: stdout chunk (state=3): >>><<< 9216 1726773043.76038: stderr chunk (state=3): >>><<< 9216 1726773043.76050: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65000", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "start": "2024-09-19 15:10:43.743666", "end": "2024-09-19 15:10:43.746820", "delta": "0:00:00.003154", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 9216 1726773043.76103: done with _execute_module (ansible.legacy.command, {'_raw_params': 'grep -x 65000 /sys/class/net/lo/mtu', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9216 1726773043.76116: _low_level_execute_command(): starting 9216 1726773043.76123: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773043.448468-9216-148526549280139/ > /dev/null 2>&1 && sleep 0' 9216 1726773043.78676: stderr chunk (state=2): >>><<< 9216 1726773043.78688: stdout chunk (state=2): >>><<< 9216 1726773043.78710: _low_level_execute_command() done: rc=0, stdout=, stderr= 9216 1726773043.78719: handler run complete 9216 1726773043.78741: Evaluated conditional (False): False 9216 1726773043.78751: attempt loop complete, returning result 9216 1726773043.78755: _execute() done 9216 1726773043.78758: dumping result to json 9216 1726773043.78763: done dumping result, returning 9216 1726773043.78772: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [0affffe7-6841-885f-bbcf-000000000012] 9216 1726773043.78778: sending task result for task 0affffe7-6841-885f-bbcf-000000000012 9216 1726773043.78820: done sending task result for task 0affffe7-6841-885f-bbcf-000000000012 9216 1726773043.78825: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-x", "65000", "/sys/class/net/lo/mtu" ], "delta": "0:00:00.003154", "end": "2024-09-19 15:10:43.746820", "rc": 0, "start": "2024-09-19 15:10:43.743666" } STDOUT: 65000 8240 1726773043.78974: no more pending results, returning what we have 8240 1726773043.78977: results queue empty 8240 1726773043.78978: checking for any_errors_fatal 8240 1726773043.78983: done checking for any_errors_fatal 8240 1726773043.78984: checking for max_fail_percentage 8240 1726773043.78987: done checking for max_fail_percentage 8240 1726773043.78987: checking to see if all hosts have failed and the running result is not ok 8240 1726773043.78988: done checking to see if all hosts have failed 8240 1726773043.78989: getting the remaining hosts for this loop 8240 1726773043.78990: done getting the remaining hosts for this loop 8240 1726773043.78993: getting the next task for host managed_node2 8240 
1726773043.78998: done getting next task for host managed_node2 8240 1726773043.79001: ^ task is: TASK: Check sysctl after role runs 8240 1726773043.79002: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773043.79005: getting variables 8240 1726773043.79007: in VariableManager get_vars() 8240 1726773043.79039: Calling all_inventory to load vars for managed_node2 8240 1726773043.79042: Calling groups_inventory to load vars for managed_node2 8240 1726773043.79044: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773043.79053: Calling all_plugins_play to load vars for managed_node2 8240 1726773043.79056: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773043.79058: Calling groups_plugins_play to load vars for managed_node2 8240 1726773043.79222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773043.79406: done with get_vars() 8240 1726773043.79419: done getting variables 8240 1726773043.79479: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysctl after role runs] ******************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:83 Thursday 19 September 2024 15:10:43 -0400 (0:00:00.411) 0:00:22.438 **** 8240 1726773043.79509: entering _queue_task() for managed_node2/shell 8240 1726773043.79708: worker is 1 (out of 1 available) 8240 1726773043.79721: exiting _queue_task() for managed_node2/shell 8240 1726773043.79733: done queuing things up, now waiting for results queue to drain 8240 1726773043.79734: waiting for pending results... 
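The task result above comes from the "Check sysfs after role runs" verification. The module arguments recorded in the log (ansible.legacy.command with _uses_shell: false) and the fact that the reported result says "changed": false even though the command module itself returned changed: true suggest a plain command task with changed_when disabled. The YAML below is a hedged reconstruction, not the literal task from tests_change_settings.yml; only the command line and the changed_when behaviour are taken from the log.

    # Hypothetical reconstruction of the sysfs verification task.
    # It asserts that the kernel_settings role set the loopback MTU to 65000.
    - name: Check sysfs after role runs
      command: grep -x 65000 /sys/class/net/lo/mtu
      changed_when: false  # inferred from the "changed": false in the reported result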
9250 1726773043.80076: running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs 9250 1726773043.80197: in run() - task 0affffe7-6841-885f-bbcf-000000000013 9250 1726773043.80215: variable 'ansible_search_path' from source: unknown 9250 1726773043.80247: calling self._execute() 9250 1726773043.80333: variable 'ansible_host' from source: host vars for 'managed_node2' 9250 1726773043.80344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9250 1726773043.80353: variable 'omit' from source: magic vars 9250 1726773043.80453: variable 'omit' from source: magic vars 9250 1726773043.80490: variable 'omit' from source: magic vars 9250 1726773043.80524: variable 'omit' from source: magic vars 9250 1726773043.80563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9250 1726773043.80603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9250 1726773043.80630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9250 1726773043.80649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9250 1726773043.80663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9250 1726773043.80694: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9250 1726773043.80700: variable 'ansible_host' from source: host vars for 'managed_node2' 9250 1726773043.80704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9250 1726773043.80809: Set connection var ansible_pipelining to False 9250 1726773043.80817: Set connection var ansible_timeout to 10 9250 1726773043.80830: Set connection var ansible_module_compression to ZIP_DEFLATED 9250 1726773043.80835: Set connection var ansible_shell_type to sh 9250 1726773043.80839: Set connection var ansible_shell_executable to /bin/sh 9250 1726773043.80846: Set connection var ansible_connection to ssh 9250 1726773043.80861: variable 'ansible_shell_executable' from source: unknown 9250 1726773043.80864: variable 'ansible_connection' from source: unknown 9250 1726773043.80868: variable 'ansible_module_compression' from source: unknown 9250 1726773043.80870: variable 'ansible_shell_type' from source: unknown 9250 1726773043.80872: variable 'ansible_shell_executable' from source: unknown 9250 1726773043.80874: variable 'ansible_host' from source: host vars for 'managed_node2' 9250 1726773043.80876: variable 'ansible_pipelining' from source: unknown 9250 1726773043.80877: variable 'ansible_timeout' from source: unknown 9250 1726773043.80879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9250 1726773043.81009: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9250 1726773043.81021: variable 'omit' from source: magic vars 9250 1726773043.81028: starting attempt loop 9250 1726773043.81031: running the handler 9250 1726773043.81040: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9250 1726773043.81057: _low_level_execute_command(): starting 9250 1726773043.81068: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9250 1726773043.83694: stdout chunk (state=2): >>>/root <<< 9250 1726773043.83707: stderr chunk (state=2): >>><<< 9250 1726773043.83722: stdout chunk (state=3): >>><<< 9250 1726773043.83740: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9250 1726773043.83758: _low_level_execute_command(): starting 9250 1726773043.83767: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174 `" && echo ansible-tmp-1726773043.8374877-9250-261253860767174="` echo /root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174 `" ) && sleep 0' 9250 1726773043.87493: stdout chunk (state=2): >>>ansible-tmp-1726773043.8374877-9250-261253860767174=/root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174 <<< 9250 1726773043.87794: stderr chunk (state=3): >>><<< 9250 1726773043.87805: stdout chunk (state=3): >>><<< 9250 1726773043.87824: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773043.8374877-9250-261253860767174=/root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174 , stderr= 9250 1726773043.87853: variable 'ansible_module_compression' from source: unknown 9250 1726773043.87913: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9250 1726773043.87953: variable 'ansible_facts' from source: unknown 9250 1726773043.88066: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174/AnsiballZ_command.py 9250 1726773043.88303: Sending initial data 9250 1726773043.88310: Sent initial data (154 bytes) 9250 1726773043.90750: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpplhijliq /root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174/AnsiballZ_command.py <<< 9250 1726773043.91862: stderr chunk (state=3): >>><<< 9250 1726773043.91872: stdout chunk (state=3): >>><<< 9250 1726773043.91894: done transferring module to remote 9250 1726773043.91905: _low_level_execute_command(): starting 9250 1726773043.91912: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174/ /root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174/AnsiballZ_command.py && sleep 0' 9250 1726773043.94331: stderr chunk (state=2): >>><<< 9250 1726773043.94341: stdout chunk (state=2): >>><<< 9250 1726773043.94357: _low_level_execute_command() done: rc=0, stdout=, stderr= 9250 1726773043.94361: _low_level_execute_command(): starting 9250 1726773043.94368: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174/AnsiballZ_command.py && sleep 0' 9250 1726773044.10144: stdout chunk (state=2): >>> {"changed": true, "stdout": "400000", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "start": "2024-09-19 15:10:44.093167", "end": "2024-09-19 15:10:44.100082", "delta": "0:00:00.006915", "msg": "", "invocation": {"module_args": 
{"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9250 1726773044.11296: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9250 1726773044.11341: stderr chunk (state=3): >>><<< 9250 1726773044.11348: stdout chunk (state=3): >>><<< 9250 1726773044.11368: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "400000", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "start": "2024-09-19 15:10:44.093167", "end": "2024-09-19 15:10:44.100082", "delta": "0:00:00.006915", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 9250 1726773044.11465: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9250 1726773044.11478: _low_level_execute_command(): starting 9250 1726773044.11488: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773043.8374877-9250-261253860767174/ > /dev/null 2>&1 && sleep 0' 9250 1726773044.13942: stderr chunk (state=2): >>><<< 9250 1726773044.13953: stdout chunk (state=2): >>><<< 9250 1726773044.13971: _low_level_execute_command() done: rc=0, stdout=, stderr= 9250 1726773044.13979: handler run complete 9250 1726773044.14002: Evaluated conditional (False): False 9250 1726773044.14014: attempt loop complete, returning result 9250 1726773044.14017: _execute() done 9250 1726773044.14020: dumping result to json 9250 1726773044.14026: done dumping result, returning 9250 1726773044.14032: done running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs [0affffe7-6841-885f-bbcf-000000000013] 9250 1726773044.14038: sending task result for task 0affffe7-6841-885f-bbcf-000000000013 9250 1726773044.14072: done sending task result for task 0affffe7-6841-885f-bbcf-000000000013 9250 1726773044.14075: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "delta": "0:00:00.006915", "end": "2024-09-19 15:10:44.100082", "rc": 0, "start": "2024-09-19 15:10:44.093167" } STDOUT: 400000 8240 1726773044.14211: no more pending results, returning what we have 8240 1726773044.14214: results queue empty 8240 1726773044.14215: checking for any_errors_fatal 8240 1726773044.14222: done checking for any_errors_fatal 8240 1726773044.14222: 
checking for max_fail_percentage 8240 1726773044.14224: done checking for max_fail_percentage 8240 1726773044.14224: checking to see if all hosts have failed and the running result is not ok 8240 1726773044.14225: done checking to see if all hosts have failed 8240 1726773044.14226: getting the remaining hosts for this loop 8240 1726773044.14227: done getting the remaining hosts for this loop 8240 1726773044.14230: getting the next task for host managed_node2 8240 1726773044.14235: done getting next task for host managed_node2 8240 1726773044.14237: ^ task is: TASK: Check sysctl after role runs 8240 1726773044.14239: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773044.14242: getting variables 8240 1726773044.14243: in VariableManager get_vars() 8240 1726773044.14277: Calling all_inventory to load vars for managed_node2 8240 1726773044.14280: Calling groups_inventory to load vars for managed_node2 8240 1726773044.14282: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773044.14293: Calling all_plugins_play to load vars for managed_node2 8240 1726773044.14296: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773044.14298: Calling groups_plugins_play to load vars for managed_node2 8240 1726773044.14446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773044.14565: done with get_vars() 8240 1726773044.14576: done getting variables 8240 1726773044.14627: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysctl after role runs] ******************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:89 Thursday 19 September 2024 15:10:44 -0400 (0:00:00.351) 0:00:22.790 **** 8240 1726773044.14658: entering _queue_task() for managed_node2/shell 8240 1726773044.14853: worker is 1 (out of 1 available) 8240 1726773044.14866: exiting _queue_task() for managed_node2/shell 8240 1726773044.14877: done queuing things up, now waiting for results queue to drain 8240 1726773044.14878: waiting for pending results... 
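The check that just completed (task path tests_change_settings.yml:83) verifies that the role applied fs.file-max = 400000. The "set -euo pipefail" prologue and _uses_shell: true in the module arguments point at the shell module with a pipeline; the sketch below is a hedged reconstruction in which everything except the command itself is assumed.

    # Hypothetical reconstruction of the check at tests_change_settings.yml:83.
    - name: Check sysctl after role runs
      shell: |
        set -euo pipefail
        sysctl -n fs.file-max | grep -x 400000
      changed_when: false

Because grep -x requires the whole line to equal 400000, the pipeline (and so the task, under pipefail) fails if sysctl reports any other value.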
9272 1726773044.15091: running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs 9272 1726773044.15218: in run() - task 0affffe7-6841-885f-bbcf-000000000014 9272 1726773044.15238: variable 'ansible_search_path' from source: unknown 9272 1726773044.15273: calling self._execute() 9272 1726773044.15356: variable 'ansible_host' from source: host vars for 'managed_node2' 9272 1726773044.15366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9272 1726773044.15375: variable 'omit' from source: magic vars 9272 1726773044.15471: variable 'omit' from source: magic vars 9272 1726773044.15503: variable 'omit' from source: magic vars 9272 1726773044.15533: variable 'omit' from source: magic vars 9272 1726773044.15574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9272 1726773044.15612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9272 1726773044.15631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9272 1726773044.15643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9272 1726773044.15652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9272 1726773044.15674: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9272 1726773044.15678: variable 'ansible_host' from source: host vars for 'managed_node2' 9272 1726773044.15681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9272 1726773044.15767: Set connection var ansible_pipelining to False 9272 1726773044.15775: Set connection var ansible_timeout to 10 9272 1726773044.15783: Set connection var ansible_module_compression to ZIP_DEFLATED 9272 1726773044.15787: Set connection var ansible_shell_type to sh 9272 1726773044.15793: Set connection var ansible_shell_executable to /bin/sh 9272 1726773044.15798: Set connection var ansible_connection to ssh 9272 1726773044.15813: variable 'ansible_shell_executable' from source: unknown 9272 1726773044.15817: variable 'ansible_connection' from source: unknown 9272 1726773044.15820: variable 'ansible_module_compression' from source: unknown 9272 1726773044.15824: variable 'ansible_shell_type' from source: unknown 9272 1726773044.15827: variable 'ansible_shell_executable' from source: unknown 9272 1726773044.15830: variable 'ansible_host' from source: host vars for 'managed_node2' 9272 1726773044.15835: variable 'ansible_pipelining' from source: unknown 9272 1726773044.15838: variable 'ansible_timeout' from source: unknown 9272 1726773044.15842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9272 1726773044.15969: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9272 1726773044.15983: variable 'omit' from source: magic vars 9272 1726773044.16001: starting attempt loop 9272 1726773044.16005: running the handler 9272 1726773044.16015: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9272 1726773044.16035: _low_level_execute_command(): starting 9272 1726773044.16043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9272 1726773044.18582: stdout chunk (state=2): >>>/root <<< 9272 1726773044.18715: stderr chunk (state=3): >>><<< 9272 1726773044.18722: stdout chunk (state=3): >>><<< 9272 1726773044.18747: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9272 1726773044.18765: _low_level_execute_command(): starting 9272 1726773044.18774: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482 `" && echo ansible-tmp-1726773044.1875982-9272-96838363580482="` echo /root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482 `" ) && sleep 0' 9272 1726773044.22303: stdout chunk (state=2): >>>ansible-tmp-1726773044.1875982-9272-96838363580482=/root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482 <<< 9272 1726773044.22433: stderr chunk (state=3): >>><<< 9272 1726773044.22440: stdout chunk (state=3): >>><<< 9272 1726773044.22455: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773044.1875982-9272-96838363580482=/root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482 , stderr= 9272 1726773044.22480: variable 'ansible_module_compression' from source: unknown 9272 1726773044.22530: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9272 1726773044.22562: variable 'ansible_facts' from source: unknown 9272 1726773044.22639: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482/AnsiballZ_command.py 9272 1726773044.22744: Sending initial data 9272 1726773044.22752: Sent initial data (153 bytes) 9272 1726773044.25442: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp0nz04wkv /root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482/AnsiballZ_command.py <<< 9272 1726773044.26760: stderr chunk (state=3): >>><<< 9272 1726773044.26771: stdout chunk (state=3): >>><<< 9272 1726773044.26796: done transferring module to remote 9272 1726773044.26809: _low_level_execute_command(): starting 9272 1726773044.26815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482/ /root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482/AnsiballZ_command.py && sleep 0' 9272 1726773044.29249: stderr chunk (state=2): >>><<< 9272 1726773044.29258: stdout chunk (state=2): >>><<< 9272 1726773044.29273: _low_level_execute_command() done: rc=0, stdout=, stderr= 9272 1726773044.29278: _low_level_execute_command(): starting 9272 1726773044.29286: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482/AnsiballZ_command.py && sleep 0' 9272 1726773044.45028: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "start": "2024-09-19 15:10:44.442964", "end": "2024-09-19 15:10:44.448813", "delta": "0:00:00.005849", "msg": "", "invocation": {"module_args": 
{"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9272 1726773044.46234: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9272 1726773044.46246: stdout chunk (state=3): >>><<< 9272 1726773044.46258: stderr chunk (state=3): >>><<< 9272 1726773044.46275: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "start": "2024-09-19 15:10:44.442964", "end": "2024-09-19 15:10:44.448813", "delta": "0:00:00.005849", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 9272 1726773044.46329: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9272 1726773044.46342: _low_level_execute_command(): starting 9272 1726773044.46348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773044.1875982-9272-96838363580482/ > /dev/null 2>&1 && sleep 0' 9272 1726773044.49011: stderr chunk (state=2): >>><<< 9272 1726773044.49023: stdout chunk (state=2): >>><<< 9272 1726773044.49043: _low_level_execute_command() done: rc=0, stdout=, stderr= 9272 1726773044.49052: handler run complete 9272 1726773044.49079: Evaluated conditional (False): False 9272 1726773044.49093: attempt loop complete, returning result 9272 1726773044.49097: _execute() done 9272 1726773044.49101: dumping result to json 9272 1726773044.49106: done dumping result, returning 9272 1726773044.49113: done running TaskExecutor() for managed_node2/TASK: Check sysctl after role runs [0affffe7-6841-885f-bbcf-000000000014] 9272 1726773044.49120: sending task result for task 0affffe7-6841-885f-bbcf-000000000014 9272 1726773044.49160: done sending task result for task 0affffe7-6841-885f-bbcf-000000000014 9272 1726773044.49164: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "delta": "0:00:00.005849", "end": "2024-09-19 15:10:44.448813", "rc": 0, "start": "2024-09-19 15:10:44.442964" } 8240 1726773044.49589: no more pending results, returning what we have 8240 1726773044.49592: results queue empty 8240 1726773044.49593: checking for any_errors_fatal 8240 1726773044.49600: done checking for any_errors_fatal 8240 
1726773044.49601: checking for max_fail_percentage 8240 1726773044.49602: done checking for max_fail_percentage 8240 1726773044.49603: checking to see if all hosts have failed and the running result is not ok 8240 1726773044.49604: done checking to see if all hosts have failed 8240 1726773044.49604: getting the remaining hosts for this loop 8240 1726773044.49606: done getting the remaining hosts for this loop 8240 1726773044.49609: getting the next task for host managed_node2 8240 1726773044.49615: done getting next task for host managed_node2 8240 1726773044.49616: ^ task is: TASK: Reboot the machine - see if settings persist after reboot 8240 1726773044.49618: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773044.49622: getting variables 8240 1726773044.49623: in VariableManager get_vars() 8240 1726773044.49656: Calling all_inventory to load vars for managed_node2 8240 1726773044.49659: Calling groups_inventory to load vars for managed_node2 8240 1726773044.49661: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773044.49674: Calling all_plugins_play to load vars for managed_node2 8240 1726773044.49677: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773044.49680: Calling groups_plugins_play to load vars for managed_node2 8240 1726773044.49848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773044.50042: done with get_vars() 8240 1726773044.50054: done getting variables 8240 1726773044.50116: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reboot the machine - see if settings persist after reboot] *************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:95 Thursday 19 September 2024 15:10:44 -0400 (0:00:00.354) 0:00:23.145 **** 8240 1726773044.50144: entering _queue_task() for managed_node2/reboot 8240 1726773044.50343: worker is 1 (out of 1 available) 8240 1726773044.50356: exiting _queue_task() for managed_node2/reboot 8240 1726773044.50370: done queuing things up, now waiting for results queue to drain 8240 1726773044.50371: waiting for pending results... 
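The second sysctl check (task path tests_change_settings.yml:89, finished just above) inverts the pattern: "sysctl -n kernel.threads-max | grep -Lxvq 29968" exits 0 only when the reported value is not the old value 29968, i.e. when the role's change to kernel.threads-max actually took effect, and the -q flag is why the recorded stdout is empty. A hedged sketch of the task, assuming the same shell/changed_when layout as the previous check:

    # Hypothetical reconstruction of the check at tests_change_settings.yml:89.
    # The inverted, quiet grep succeeds only if kernel.threads-max is no longer 29968.
    - name: Check sysctl after role runs
      shell: |
        set -euo pipefail
        sysctl -n kernel.threads-max | grep -Lxvq 29968
      changed_when: false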
9295 1726773044.50575: running TaskExecutor() for managed_node2/TASK: Reboot the machine - see if settings persist after reboot 9295 1726773044.50691: in run() - task 0affffe7-6841-885f-bbcf-000000000015 9295 1726773044.50709: variable 'ansible_search_path' from source: unknown 9295 1726773044.50741: calling self._execute() 9295 1726773044.50824: variable 'ansible_host' from source: host vars for 'managed_node2' 9295 1726773044.50833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9295 1726773044.50840: variable 'omit' from source: magic vars 9295 1726773044.50933: variable 'omit' from source: magic vars 9295 1726773044.50960: variable 'omit' from source: magic vars 9295 1726773044.50993: variable 'omit' from source: magic vars 9295 1726773044.51028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9295 1726773044.51057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9295 1726773044.51078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9295 1726773044.51095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9295 1726773044.51106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9295 1726773044.51131: variable 'inventory_hostname' from source: host vars for 'managed_node2' 9295 1726773044.51136: variable 'ansible_host' from source: host vars for 'managed_node2' 9295 1726773044.51140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9295 1726773044.51230: Set connection var ansible_pipelining to False 9295 1726773044.51238: Set connection var ansible_timeout to 10 9295 1726773044.51245: Set connection var ansible_module_compression to ZIP_DEFLATED 9295 1726773044.51247: Set connection var ansible_shell_type to sh 9295 1726773044.51252: Set connection var ansible_shell_executable to /bin/sh 9295 1726773044.51256: Set connection var ansible_connection to ssh 9295 1726773044.51275: variable 'ansible_shell_executable' from source: unknown 9295 1726773044.51280: variable 'ansible_connection' from source: unknown 9295 1726773044.51283: variable 'ansible_module_compression' from source: unknown 9295 1726773044.51288: variable 'ansible_shell_type' from source: unknown 9295 1726773044.51291: variable 'ansible_shell_executable' from source: unknown 9295 1726773044.51293: variable 'ansible_host' from source: host vars for 'managed_node2' 9295 1726773044.51296: variable 'ansible_pipelining' from source: unknown 9295 1726773044.51298: variable 'ansible_timeout' from source: unknown 9295 1726773044.51301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 9295 1726773044.51415: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9295 1726773044.51428: variable 'omit' from source: magic vars 9295 1726773044.51435: starting attempt loop 9295 1726773044.51439: running the handler 9295 1726773044.51447: reboot: running setup module to get distribution 9295 1726773044.51459: _low_level_execute_command(): starting 9295 1726773044.51471: _low_level_execute_command(): 
executing: /bin/sh -c 'echo ~ && sleep 0' 9295 1726773044.54380: stdout chunk (state=2): >>>/root <<< 9295 1726773044.54431: stderr chunk (state=3): >>><<< 9295 1726773044.54440: stdout chunk (state=3): >>><<< 9295 1726773044.54462: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 9295 1726773044.54481: _low_level_execute_command(): starting 9295 1726773044.54491: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407 `" && echo ansible-tmp-1726773044.5447402-9295-147479805162407="` echo /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407 `" ) && sleep 0' 9295 1726773044.57756: stdout chunk (state=2): >>>ansible-tmp-1726773044.5447402-9295-147479805162407=/root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407 <<< 9295 1726773044.57771: stderr chunk (state=2): >>><<< 9295 1726773044.57783: stdout chunk (state=3): >>><<< 9295 1726773044.57800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773044.5447402-9295-147479805162407=/root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407 , stderr= 9295 1726773044.57830: variable 'ansible_module_compression' from source: unknown 9295 1726773044.57892: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 9295 1726773044.57959: variable 'ansible_facts' from source: unknown 9295 1726773044.58172: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/AnsiballZ_setup.py 9295 1726773044.58951: Sending initial data 9295 1726773044.58958: Sent initial data (152 bytes) 9295 1726773044.62426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp1bencg9p /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/AnsiballZ_setup.py <<< 9295 1726773044.68515: stderr chunk (state=3): >>><<< 9295 1726773044.68528: stdout chunk (state=3): >>><<< 9295 1726773044.68554: done transferring module to remote 9295 1726773044.68567: _low_level_execute_command(): starting 9295 1726773044.68574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/ /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/AnsiballZ_setup.py && sleep 0' 9295 1726773044.71225: stderr chunk (state=2): >>><<< 9295 1726773044.71237: stdout chunk (state=2): >>><<< 9295 1726773044.71254: _low_level_execute_command() done: rc=0, stdout=, stderr= 9295 1726773044.71259: _low_level_execute_command(): starting 9295 1726773044.71264: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/AnsiballZ_setup.py && sleep 0' 9295 1726773044.99418: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", 
"ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-9-64.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-64", "ansible_nodename": "ip-10-31-9-64.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "df37f0c23b234636ab118236eb740b01", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "44", "epoch": "1726773044", "epoch_int": "1726773044", "date": "2024-09-19", "time": "15:10:44", "iso8601_micro": "2024-09-19T19:10:44.989902Z", "iso8601": "2024-09-19T19:10:44Z", "iso8601_basic": "20240919T151044989902", "iso8601_basic_short": "20240919T151044", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"LS_COLORS": "", "<<< 9295 1726773044.99457: stdout chunk (state=3): >>>SSH_CONNECTION": "10.31.14.7 38020 10.31.9.64 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.14.7 38020 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMo3DZDg5lt06Jrw0Qd9X74dNy/nBSXaUtMMS052uVgTKSXm0tKCx2zaAxNgM505uL6pZUEUOZeRR3Z/LNdlNSBT8r+2LnUpqcmdODxoyccRSKqmwLK1zJVzwSXZ+AjBD3x9gTlBYQayaOpqR1f05hNnHy2R3kxXoB1tNNpqpz3bAAAAFQCELYwNT97+ZXrdhwhMhoA7GWXL9QAAAIAtG2SRvcGWlL2z5hFtYYMsg8GRtVOEKlX108Ws20I7sI95Nm0WYvTIwFqYPINzLfCA+Ls/dLGPq2G5YUvm7QgMzmHhsK9TJhhd889W4OzyNzFL2GT6B86x7dZphalrTs/0syAVSP84E66QTj7TJU/HFsVowFD4iq3yKCBHZJJADAAAAIEAhWG94qCQeDFTxKLHPtQNkV9HI8hfIJXDM0pIL2n7yQ4TU9nWEOQtJjRFpp8k2NZ14U2EHZ7RelwhzfDZFiBK/2NQ+JoDjS0bFevNKG07tHLq+FXOxS5Gysh8BpFPLhRgxptusyg4njXv6abAem4QO5Gnikd8Ctf4orp3mf5Mo/w=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCpwL8CyExbkE87G4Z2ELO87C73MrB52pvN1/REWnLK8RyqlqY8GHCeODUP+lN9RROaKP+1mUq3R5P1vUyf0NUVZoeIitePd6dIY/HwffaeTzXLBp5sMcjPisFL9fVo1g9PkYZwmRgL4IDj39kSp4ttnuRynttW4g2Rs1HM67H6KkzKpM6kihrSGu78vUz+DUKL+CHSg8G9JAZwNYy4MIhlxZCBD6JVscaKv4UDDIKGaxur3MJxlFE5md8KVyzl2k+WSa/7XfMN2st+rOPN7S0/rxSQCqHrUjPlqorz9aGTRlP4RZAYaDtqE4c90/EHeAATAfCsJhdktOAD9qONVrn3xGVN5xCXGmMfYLZ45DqZVWZ4YX5ZyL6QgAb+85FH2gkWOHTqYMI6TEV7e1J7AWXpkqKygVZtILvPsrKCUYHLORO3bEdTWm5PcqLwhzTi7Myybh+twLfLdY2Yz1rPuCkWuI/Cz5RdyJxNeH9XbnhvF+7lVXY/7xuObH99wqWSFCE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAWpvYoviifCdAjKvSxQ8CBgYzbKEPHp2fMY65o4pwBevyZghxLsKaAsi+dFVlgZs0/UVAVgvbOXtdqsH0tvoHQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBNujYY/kuSC3n5Sb7T5pTC/SxbGKraWJ1B8z8Tcma+S", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 9295 1726773045.01012: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 9295 1726773045.01061: stderr chunk (state=3): >>><<< 9295 1726773045.01071: stdout chunk (state=3): >>><<< 9295 1726773045.01094: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-9-64.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-64", "ansible_nodename": "ip-10-31-9-64.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "df37f0c23b234636ab118236eb740b01", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "44", "epoch": "1726773044", "epoch_int": "1726773044", "date": "2024-09-19", "time": "15:10:44", "iso8601_micro": "2024-09-19T19:10:44.989902Z", "iso8601": "2024-09-19T19:10:44Z", "iso8601_basic": "20240919T151044989902", "iso8601_basic_short": "20240919T151044", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.7 38020 10.31.9.64 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "6", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.14.7 38020 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMo3DZDg5lt06Jrw0Qd9X74dNy/nBSXaUtMMS052uVgTKSXm0tKCx2zaAxNgM505uL6pZUEUOZeRR3Z/LNdlNSBT8r+2LnUpqcmdODxoyccRSKqmwLK1zJVzwSXZ+AjBD3x9gTlBYQayaOpqR1f05hNnHy2R3kxXoB1tNNpqpz3bAAAAFQCELYwNT97+ZXrdhwhMhoA7GWXL9QAAAIAtG2SRvcGWlL2z5hFtYYMsg8GRtVOEKlX108Ws20I7sI95Nm0WYvTIwFqYPINzLfCA+Ls/dLGPq2G5YUvm7QgMzmHhsK9TJhhd889W4OzyNzFL2GT6B86x7dZphalrTs/0syAVSP84E66QTj7TJU/HFsVowFD4iq3yKCBHZJJADAAAAIEAhWG94qCQeDFTxKLHPtQNkV9HI8hfIJXDM0pIL2n7yQ4TU9nWEOQtJjRFpp8k2NZ14U2EHZ7RelwhzfDZFiBK/2NQ+JoDjS0bFevNKG07tHLq+FXOxS5Gysh8BpFPLhRgxptusyg4njXv6abAem4QO5Gnikd8Ctf4orp3mf5Mo/w=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCpwL8CyExbkE87G4Z2ELO87C73MrB52pvN1/REWnLK8RyqlqY8GHCeODUP+lN9RROaKP+1mUq3R5P1vUyf0NUVZoeIitePd6dIY/HwffaeTzXLBp5sMcjPisFL9fVo1g9PkYZwmRgL4IDj39kSp4ttnuRynttW4g2Rs1HM67H6KkzKpM6kihrSGu78vUz+DUKL+CHSg8G9JAZwNYy4MIhlxZCBD6JVscaKv4UDDIKGaxur3MJxlFE5md8KVyzl2k+WSa/7XfMN2st+rOPN7S0/rxSQCqHrUjPlqorz9aGTRlP4RZAYaDtqE4c90/EHeAATAfCsJhdktOAD9qONVrn3xGVN5xCXGmMfYLZ45DqZVWZ4YX5ZyL6QgAb+85FH2gkWOHTqYMI6TEV7e1J7AWXpkqKygVZtILvPsrKCUYHLORO3bEdTWm5PcqLwhzTi7Myybh+twLfLdY2Yz1rPuCkWuI/Cz5RdyJxNeH9XbnhvF+7lVXY/7xuObH99wqWSFCE=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAWpvYoviifCdAjKvSxQ8CBgYzbKEPHp2fMY65o4pwBevyZghxLsKaAsi+dFVlgZs0/UVAVgvbOXtdqsH0tvoHQ=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBNujYY/kuSC3n5Sb7T5pTC/SxbGKraWJ1B8z8Tcma+S", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.9.64 closed. 
9295 1726773045.01239: done with _execute_module (ansible.legacy.setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9295 1726773045.01259: reboot: distribution: {'name': 'centos', 'version': '8', 'family': 'redhat'} 9295 1726773045.01271: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9295 1726773045.01275: _low_level_execute_command(): starting 9295 1726773045.01280: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9295 1726773045.03898: stdout chunk (state=2): >>>e0a28e2c-5da3-41d4-b106-b559502b382e <<< 9295 1726773045.03993: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9295 1726773045.04037: stderr chunk (state=3): >>><<< 9295 1726773045.04044: stdout chunk (state=3): >>><<< 9295 1726773045.04059: _low_level_execute_command() done: rc=0, stdout=e0a28e2c-5da3-41d4-b106-b559502b382e , stderr=Shared connection to 10.31.9.64 closed. 9295 1726773045.04065: reboot: last boot time: e0a28e2c-5da3-41d4-b106-b559502b382e 9295 1726773045.04082: reboot: connect_timeout connection option has not been set 9295 1726773045.04091: reboot: running find module looking in ['/sbin', '/bin', '/usr/sbin', '/usr/bin', '/usr/local/sbin'] to get path for "shutdown" 9295 1726773045.04109: variable 'ansible_module_compression' from source: unknown 9295 1726773045.04142: ANSIBALLZ: Using generic lock for ansible.legacy.find 9295 1726773045.04147: ANSIBALLZ: Acquiring lock 9295 1726773045.04150: ANSIBALLZ: Lock acquired: 139787572477392 9295 1726773045.04154: ANSIBALLZ: Creating module 9295 1726773045.13277: ANSIBALLZ: Writing module into payload 9295 1726773045.13416: ANSIBALLZ: Writing module 9295 1726773045.13437: ANSIBALLZ: Renaming module 9295 1726773045.13444: ANSIBALLZ: Done creating module 9295 1726773045.13456: variable 'ansible_facts' from source: unknown 9295 1726773045.13541: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/AnsiballZ_find.py 9295 1726773045.13662: Sending initial data 9295 1726773045.13673: Sent initial data (151 bytes) 9295 1726773045.16528: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp6m4kv37w /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/AnsiballZ_find.py <<< 9295 1726773045.18324: stderr chunk (state=3): >>><<< 9295 1726773045.18334: stdout chunk (state=3): >>><<< 9295 1726773045.18355: done transferring module to remote 9295 1726773045.18367: _low_level_execute_command(): starting 9295 1726773045.18373: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/ /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/AnsiballZ_find.py && sleep 0' 9295 1726773045.20973: stderr chunk (state=2): >>><<< 9295 1726773045.20993: stdout chunk (state=3): >>><<< 9295 1726773045.21007: 
_low_level_execute_command() done: rc=0, stdout=, stderr= 9295 1726773045.21012: _low_level_execute_command(): starting 9295 1726773045.21017: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/AnsiballZ_find.py && sleep 0' 9295 1726773045.55221: stdout chunk (state=2): >>> {"files": [{"path": "/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726773045.364153, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}, {"path": "/usr/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726773045.364153, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}], "changed": false, "msg": "All paths examined", "matched": 2, "examined": 2640, "skipped_paths": {}, "invocation": {"module_args": {"paths": ["/sbin", "/bin", "/usr/sbin", "/usr/bin", "/usr/local/sbin"], "patterns": ["shutdown"], "file_type": "any", "read_whole_file": false, "age_stamp": "mtime", "recurse": false, "hidden": false, "follow": false, "get_checksum": false, "use_regex": false, "exact_mode": true, "excludes": null, "contains": null, "age": null, "size": null, "depth": null, "mode": null}}} <<< 9295 1726773045.56892: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 9295 1726773045.56906: stdout chunk (state=3): >>><<< 9295 1726773045.56918: stderr chunk (state=3): >>><<< 9295 1726773045.56932: _low_level_execute_command() done: rc=0, stdout= {"files": [{"path": "/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726773045.364153, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}, {"path": "/usr/sbin/shutdown", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 16, "inode": 720002, "dev": 51713, "nlink": 1, "atime": 1726773045.364153, "mtime": 1712922012.0, "ctime": 1716968724.286, "gr_name": "root", "pw_name": "root", "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false}], "changed": false, "msg": "All paths examined", "matched": 2, "examined": 2640, "skipped_paths": {}, "invocation": {"module_args": {"paths": ["/sbin", "/bin", "/usr/sbin", "/usr/bin", "/usr/local/sbin"], "patterns": ["shutdown"], "file_type": "any", "read_whole_file": false, "age_stamp": "mtime", "recurse": false, "hidden": false, "follow": false, "get_checksum": false, "use_regex": false, "exact_mode": true, "excludes": null, "contains": null, "age": null, "size": null, "depth": null, "mode": null}}} , stderr=Shared connection to 10.31.9.64 closed. 9295 1726773045.57008: done with _execute_module (ansible.legacy.find, {'paths': ['/sbin', '/bin', '/usr/sbin', '/usr/bin', '/usr/local/sbin'], 'patterns': ['shutdown'], 'file_type': 'any', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.find', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 9295 1726773045.57023: reboot: rebooting server with command '/sbin/shutdown -r 0 "Reboot initiated by Ansible"' 9295 1726773045.57027: _low_level_execute_command(): starting 9295 1726773045.57031: _low_level_execute_command(): executing: /bin/sh -c '/sbin/shutdown -r 0 "Reboot initiated by Ansible" && sleep 0' 9295 1726773045.61470: stdout chunk (state=2): >>>Shutdown scheduled for Thu 2024-09-19 15:10:45 EDT, use 'shutdown -c' to cancel. <<< 9295 1726773045.61786: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9295 1726773045.61829: stderr chunk (state=3): >>><<< 9295 1726773045.61836: stdout chunk (state=3): >>><<< 9295 1726773045.61852: _low_level_execute_command() done: rc=0, stdout=Shutdown scheduled for Thu 2024-09-19 15:10:45 EDT, use 'shutdown -c' to cancel. , stderr=Shared connection to 10.31.9.64 closed. 
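Shutdown has now been scheduled on the managed node. What follows in the log is the plugin's reboot verification: it re-reads /proc/sys/kernel/random/boot_id and keeps retrying until the value differs from the one captured before the reboot, which proves the kernel actually restarted rather than the SSH connection merely dropping. Outside the reboot plugin, the same check could be written as ordinary tasks; the sketch below is an illustrative equivalent under that assumption, not the plugin's own code.

    # Illustrative manual equivalent of the reboot plugin's boot-id check (not its actual code).
    - name: Record the boot id before rebooting
      command: cat /proc/sys/kernel/random/boot_id
      register: boot_id_before
      changed_when: false

    - name: Reboot out of band so the task does not wait on the dying connection
      command: shutdown -r now "Reboot initiated by Ansible"
      async: 1
      poll: 0

    - name: Wait for the host to come back
      wait_for_connection:
        delay: 10
        timeout: 600

    - name: Confirm the boot id changed, i.e. a real reboot happened
      command: cat /proc/sys/kernel/random/boot_id
      register: boot_id_after
      changed_when: false
      failed_when: boot_id_after.stdout == boot_id_before.stdout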
9295 1726773045.61870: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9295 1726773045.61875: _low_level_execute_command(): starting 9295 1726773045.61881: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9295 1726773045.65931: stderr chunk (state=2): >>>Shared connection to 10.31.9.64 closed. <<< 9295 1726773045.65946: stdout chunk (state=2): >>><<< 9295 1726773045.65958: stderr chunk (state=3): >>><<< 9295 1726773045.66476: reboot: last boot time check fail 'Failed to connect to the host via ssh: Shared connection to 10.31.9.64 closed.', retrying in 1.4430 seconds... 9295 1726773047.10794: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9295 1726773047.10807: _low_level_execute_command(): starting 9295 1726773047.10813: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9295 1726773057.12275: stderr chunk (state=2): >>>ssh: connect to host 10.31.9.64 port 22: Connection timed out <<< 9295 1726773057.12322: stderr chunk (state=3): >>><<< 9295 1726773057.12329: stdout chunk (state=3): >>><<< 9295 1726773057.12796: reboot: last boot time check fail 'Failed to connect to the host via ssh: ssh: connect to host 10.31.9.64 port 22: Connection timed out', retrying in 2.5870 seconds... 9295 1726773059.71517: reboot: getting boot time with command: 'cat /proc/sys/kernel/random/boot_id' 9295 1726773059.71526: _low_level_execute_command(): starting 9295 1726773059.71532: _low_level_execute_command(): executing: /bin/sh -c 'cat /proc/sys/kernel/random/boot_id && sleep 0' 9295 1726773067.71001: stdout chunk (state=2): >>>bd5fc49a-8ba5-4bde-b6b9-02c1b32fec39 <<< 9295 1726773067.71367: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9295 1726773067.71424: stderr chunk (state=3): >>><<< 9295 1726773067.71431: stdout chunk (state=3): >>><<< 9295 1726773067.71446: _low_level_execute_command() done: rc=0, stdout=bd5fc49a-8ba5-4bde-b6b9-02c1b32fec39 , stderr=Shared connection to 10.31.9.64 closed. 9295 1726773067.71453: reboot: last boot time: bd5fc49a-8ba5-4bde-b6b9-02c1b32fec39 9295 1726773067.71458: reboot: last boot time check success 9295 1726773067.71468: reboot: attempting post-reboot test command 'tuned-adm active' 9295 1726773067.71474: _low_level_execute_command(): starting 9295 1726773067.71479: _low_level_execute_command(): executing: /bin/sh -c 'tuned-adm active && sleep 0' 9295 1726773067.85893: stdout chunk (state=2): >>>Current active profile: virtual-guest kernel_settings <<< 9295 1726773067.86986: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 9295 1726773067.87034: stderr chunk (state=3): >>><<< 9295 1726773067.87041: stdout chunk (state=3): >>><<< 9295 1726773067.87058: _low_level_execute_command() done: rc=0, stdout=Current active profile: virtual-guest kernel_settings , stderr=Shared connection to 10.31.9.64 closed. 
9295 1726773067.87065: reboot: post-reboot test command success 9295 1726773067.87076: _low_level_execute_command(): starting 9295 1726773067.87082: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773044.5447402-9295-147479805162407/ > /dev/null 2>&1 && sleep 0' 9295 1726773067.89664: stderr chunk (state=2): >>><<< 9295 1726773067.89676: stdout chunk (state=2): >>><<< 9295 1726773067.89693: _low_level_execute_command() done: rc=0, stdout=, stderr= 9295 1726773067.89697: handler run complete 9295 1726773067.89712: attempt loop complete, returning result 9295 1726773067.89716: _execute() done 9295 1726773067.89719: dumping result to json 9295 1726773067.89723: done dumping result, returning 9295 1726773067.89730: done running TaskExecutor() for managed_node2/TASK: Reboot the machine - see if settings persist after reboot [0affffe7-6841-885f-bbcf-000000000015] 9295 1726773067.89735: sending task result for task 0affffe7-6841-885f-bbcf-000000000015 9295 1726773067.89761: done sending task result for task 0affffe7-6841-885f-bbcf-000000000015 9295 1726773067.89765: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "elapsed": 22, "rebooted": true } 8240 1726773067.89866: no more pending results, returning what we have 8240 1726773067.89869: results queue empty 8240 1726773067.89869: checking for any_errors_fatal 8240 1726773067.89874: done checking for any_errors_fatal 8240 1726773067.89875: checking for max_fail_percentage 8240 1726773067.89876: done checking for max_fail_percentage 8240 1726773067.89877: checking to see if all hosts have failed and the running result is not ok 8240 1726773067.89877: done checking to see if all hosts have failed 8240 1726773067.89878: getting the remaining hosts for this loop 8240 1726773067.89879: done getting the remaining hosts for this loop 8240 1726773067.89882: getting the next task for host managed_node2 8240 1726773067.89888: done getting next task for host managed_node2 8240 1726773067.89889: ^ task is: TASK: Check sysctl after reboot 8240 1726773067.89891: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773067.89894: getting variables 8240 1726773067.89895: in VariableManager get_vars() 8240 1726773067.89923: Calling all_inventory to load vars for managed_node2 8240 1726773067.89926: Calling groups_inventory to load vars for managed_node2 8240 1726773067.89928: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773067.89937: Calling all_plugins_play to load vars for managed_node2 8240 1726773067.89939: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773067.89942: Calling groups_plugins_play to load vars for managed_node2 8240 1726773067.90094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773067.90237: done with get_vars() 8240 1726773067.90244: done getting variables 8240 1726773067.90284: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysctl after reboot] *********************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:99 Thursday 19 September 2024 15:11:07 -0400 (0:00:23.401) 0:00:46.546 **** 8240 1726773067.90307: entering _queue_task() for managed_node2/shell 8240 1726773067.90465: worker is 1 (out of 1 available) 8240 1726773067.90480: exiting _queue_task() for managed_node2/shell 8240 1726773067.90495: done queuing things up, now waiting for results queue to drain 8240 1726773067.90497: waiting for pending results... 10309 1726773067.90617: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10309 1726773067.90717: in run() - task 0affffe7-6841-885f-bbcf-000000000016 10309 1726773067.90736: variable 'ansible_search_path' from source: unknown 10309 1726773067.90764: calling self._execute() 10309 1726773067.90832: variable 'ansible_host' from source: host vars for 'managed_node2' 10309 1726773067.90840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10309 1726773067.90849: variable 'omit' from source: magic vars 10309 1726773067.90924: variable 'omit' from source: magic vars 10309 1726773067.90949: variable 'omit' from source: magic vars 10309 1726773067.90976: variable 'omit' from source: magic vars 10309 1726773067.91011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10309 1726773067.91045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10309 1726773067.91069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10309 1726773067.91089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10309 1726773067.91102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10309 1726773067.91126: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10309 1726773067.91132: variable 'ansible_host' from source: host vars for 'managed_node2' 10309 1726773067.91136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10309 1726773067.91211: Set connection var ansible_pipelining to False 10309 
1726773067.91218: Set connection var ansible_timeout to 10 10309 1726773067.91226: Set connection var ansible_module_compression to ZIP_DEFLATED 10309 1726773067.91229: Set connection var ansible_shell_type to sh 10309 1726773067.91234: Set connection var ansible_shell_executable to /bin/sh 10309 1726773067.91239: Set connection var ansible_connection to ssh 10309 1726773067.91254: variable 'ansible_shell_executable' from source: unknown 10309 1726773067.91258: variable 'ansible_connection' from source: unknown 10309 1726773067.91261: variable 'ansible_module_compression' from source: unknown 10309 1726773067.91265: variable 'ansible_shell_type' from source: unknown 10309 1726773067.91268: variable 'ansible_shell_executable' from source: unknown 10309 1726773067.91274: variable 'ansible_host' from source: host vars for 'managed_node2' 10309 1726773067.91278: variable 'ansible_pipelining' from source: unknown 10309 1726773067.91281: variable 'ansible_timeout' from source: unknown 10309 1726773067.91287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10309 1726773067.91386: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10309 1726773067.91399: variable 'omit' from source: magic vars 10309 1726773067.91405: starting attempt loop 10309 1726773067.91409: running the handler 10309 1726773067.91418: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10309 1726773067.91434: _low_level_execute_command(): starting 10309 1726773067.91442: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10309 1726773067.93838: stdout chunk (state=2): >>>/root <<< 10309 1726773067.93928: stderr chunk (state=3): >>><<< 10309 1726773067.93936: stdout chunk (state=3): >>><<< 10309 1726773067.93956: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10309 1726773067.93976: _low_level_execute_command(): starting 10309 1726773067.93983: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673 `" && echo ansible-tmp-1726773067.9396439-10309-224689037144673="` echo /root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673 `" ) && sleep 0' 10309 1726773067.96521: stdout chunk (state=2): >>>ansible-tmp-1726773067.9396439-10309-224689037144673=/root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673 <<< 10309 1726773067.96650: stderr chunk (state=3): >>><<< 10309 1726773067.96657: stdout chunk (state=3): >>><<< 10309 1726773067.96674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773067.9396439-10309-224689037144673=/root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673 , stderr= 10309 1726773067.96702: variable 'ansible_module_compression' from source: unknown 10309 1726773067.96747: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10309 1726773067.96780: variable 'ansible_facts' from 
source: unknown 10309 1726773067.96855: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673/AnsiballZ_command.py 10309 1726773067.96961: Sending initial data 10309 1726773067.96968: Sent initial data (155 bytes) 10309 1726773067.99720: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpdipy7ckr /root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673/AnsiballZ_command.py <<< 10309 1726773068.00807: stderr chunk (state=3): >>><<< 10309 1726773068.00815: stdout chunk (state=3): >>><<< 10309 1726773068.00835: done transferring module to remote 10309 1726773068.00845: _low_level_execute_command(): starting 10309 1726773068.00850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673/ /root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673/AnsiballZ_command.py && sleep 0' 10309 1726773068.03293: stderr chunk (state=2): >>><<< 10309 1726773068.03304: stdout chunk (state=2): >>><<< 10309 1726773068.03320: _low_level_execute_command() done: rc=0, stdout=, stderr= 10309 1726773068.03324: _low_level_execute_command(): starting 10309 1726773068.03329: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673/AnsiballZ_command.py && sleep 0' 10309 1726773068.21297: stdout chunk (state=2): >>> {"changed": true, "stdout": "400000", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "start": "2024-09-19 15:11:07.544242", "end": "2024-09-19 15:11:07.553604", "delta": "0:00:00.009362", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10309 1726773068.22440: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10309 1726773068.22489: stderr chunk (state=3): >>><<< 10309 1726773068.22496: stdout chunk (state=3): >>><<< 10309 1726773068.22514: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "400000", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "start": "2024-09-19 15:11:07.544242", "end": "2024-09-19 15:11:07.553604", "delta": "0:00:00.009362", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
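The module args echoed in the result above imply a shell task along these lines (a sketch; changed_when: false is assumed from the task's final "changed": false result, and the pipeline fails unless fs.file-max reads back exactly 400000):

# Hypothetical reconstruction of the task at tests_change_settings.yml:99
- name: Check sysctl after reboot
  ansible.builtin.shell: |
    set -euo pipefail
    sysctl -n fs.file-max | grep -x 400000
  changed_when: false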
10309 1726773068.22554: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10309 1726773068.22565: _low_level_execute_command(): starting 10309 1726773068.22573: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773067.9396439-10309-224689037144673/ > /dev/null 2>&1 && sleep 0' 10309 1726773068.24962: stderr chunk (state=2): >>><<< 10309 1726773068.24974: stdout chunk (state=2): >>><<< 10309 1726773068.24990: _low_level_execute_command() done: rc=0, stdout=, stderr= 10309 1726773068.24999: handler run complete 10309 1726773068.25016: Evaluated conditional (False): False 10309 1726773068.25025: attempt loop complete, returning result 10309 1726773068.25028: _execute() done 10309 1726773068.25031: dumping result to json 10309 1726773068.25036: done dumping result, returning 10309 1726773068.25043: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [0affffe7-6841-885f-bbcf-000000000016] 10309 1726773068.25049: sending task result for task 0affffe7-6841-885f-bbcf-000000000016 10309 1726773068.25081: done sending task result for task 0affffe7-6841-885f-bbcf-000000000016 10309 1726773068.25086: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -x 400000", "delta": "0:00:00.009362", "end": "2024-09-19 15:11:07.553604", "rc": 0, "start": "2024-09-19 15:11:07.544242" } STDOUT: 400000 8240 1726773068.25244: no more pending results, returning what we have 8240 1726773068.25248: results queue empty 8240 1726773068.25248: checking for any_errors_fatal 8240 1726773068.25254: done checking for any_errors_fatal 8240 1726773068.25254: checking for max_fail_percentage 8240 1726773068.25255: done checking for max_fail_percentage 8240 1726773068.25256: checking to see if all hosts have failed and the running result is not ok 8240 1726773068.25257: done checking to see if all hosts have failed 8240 1726773068.25257: getting the remaining hosts for this loop 8240 1726773068.25259: done getting the remaining hosts for this loop 8240 1726773068.25263: getting the next task for host managed_node2 8240 1726773068.25268: done getting next task for host managed_node2 8240 1726773068.25270: ^ task is: TASK: Check sysfs after reboot 8240 1726773068.25271: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773068.25275: getting variables 8240 1726773068.25278: in VariableManager get_vars() 8240 1726773068.25312: Calling all_inventory to load vars for managed_node2 8240 1726773068.25315: Calling groups_inventory to load vars for managed_node2 8240 1726773068.25316: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773068.25325: Calling all_plugins_play to load vars for managed_node2 8240 1726773068.25327: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773068.25329: Calling groups_plugins_play to load vars for managed_node2 8240 1726773068.25437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773068.25549: done with get_vars() 8240 1726773068.25558: done getting variables 8240 1726773068.25603: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysfs after reboot] ************************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:105 Thursday 19 September 2024 15:11:08 -0400 (0:00:00.353) 0:00:46.900 **** 8240 1726773068.25626: entering _queue_task() for managed_node2/command 8240 1726773068.25784: worker is 1 (out of 1 available) 8240 1726773068.25800: exiting _queue_task() for managed_node2/command 8240 1726773068.25812: done queuing things up, now waiting for results queue to drain 8240 1726773068.25813: waiting for pending results... 10328 1726773068.25934: running TaskExecutor() for managed_node2/TASK: Check sysfs after reboot 10328 1726773068.26027: in run() - task 0affffe7-6841-885f-bbcf-000000000017 10328 1726773068.26042: variable 'ansible_search_path' from source: unknown 10328 1726773068.26074: calling self._execute() 10328 1726773068.26143: variable 'ansible_host' from source: host vars for 'managed_node2' 10328 1726773068.26150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10328 1726773068.26157: variable 'omit' from source: magic vars 10328 1726773068.26231: variable 'omit' from source: magic vars 10328 1726773068.26255: variable 'omit' from source: magic vars 10328 1726773068.26280: variable 'omit' from source: magic vars 10328 1726773068.26316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10328 1726773068.26342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10328 1726773068.26361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10328 1726773068.26379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10328 1726773068.26393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10328 1726773068.26417: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10328 1726773068.26422: variable 'ansible_host' from source: host vars for 'managed_node2' 10328 1726773068.26427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10328 1726773068.26502: Set connection var ansible_pipelining to False 10328 
1726773068.26509: Set connection var ansible_timeout to 10 10328 1726773068.26517: Set connection var ansible_module_compression to ZIP_DEFLATED 10328 1726773068.26520: Set connection var ansible_shell_type to sh 10328 1726773068.26525: Set connection var ansible_shell_executable to /bin/sh 10328 1726773068.26530: Set connection var ansible_connection to ssh 10328 1726773068.26545: variable 'ansible_shell_executable' from source: unknown 10328 1726773068.26549: variable 'ansible_connection' from source: unknown 10328 1726773068.26552: variable 'ansible_module_compression' from source: unknown 10328 1726773068.26556: variable 'ansible_shell_type' from source: unknown 10328 1726773068.26560: variable 'ansible_shell_executable' from source: unknown 10328 1726773068.26563: variable 'ansible_host' from source: host vars for 'managed_node2' 10328 1726773068.26568: variable 'ansible_pipelining' from source: unknown 10328 1726773068.26573: variable 'ansible_timeout' from source: unknown 10328 1726773068.26578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10328 1726773068.26671: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10328 1726773068.26681: variable 'omit' from source: magic vars 10328 1726773068.26688: starting attempt loop 10328 1726773068.26691: running the handler 10328 1726773068.26702: _low_level_execute_command(): starting 10328 1726773068.26708: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10328 1726773068.29054: stdout chunk (state=2): >>>/root <<< 10328 1726773068.29175: stderr chunk (state=3): >>><<< 10328 1726773068.29190: stdout chunk (state=3): >>><<< 10328 1726773068.29204: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10328 1726773068.29218: _low_level_execute_command(): starting 10328 1726773068.29223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547 `" && echo ansible-tmp-1726773068.2921185-10328-15162097597547="` echo /root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547 `" ) && sleep 0' 10328 1726773068.31881: stdout chunk (state=2): >>>ansible-tmp-1726773068.2921185-10328-15162097597547=/root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547 <<< 10328 1726773068.32010: stderr chunk (state=3): >>><<< 10328 1726773068.32017: stdout chunk (state=3): >>><<< 10328 1726773068.32031: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773068.2921185-10328-15162097597547=/root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547 , stderr= 10328 1726773068.32056: variable 'ansible_module_compression' from source: unknown 10328 1726773068.32104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10328 1726773068.32134: variable 'ansible_facts' from source: unknown 10328 1726773068.32213: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547/AnsiballZ_command.py 10328 1726773068.32312: Sending initial data 10328 1726773068.32320: Sent initial data (154 bytes) 10328 1726773068.34793: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-8240kvoq26km/tmpx6piv1gf /root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547/AnsiballZ_command.py <<< 10328 1726773068.35893: stderr chunk (state=3): >>><<< 10328 1726773068.35900: stdout chunk (state=3): >>><<< 10328 1726773068.35918: done transferring module to remote 10328 1726773068.35928: _low_level_execute_command(): starting 10328 1726773068.35934: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547/ /root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547/AnsiballZ_command.py && sleep 0' 10328 1726773068.38259: stderr chunk (state=2): >>><<< 10328 1726773068.38266: stdout chunk (state=2): >>><<< 10328 1726773068.38279: _low_level_execute_command() done: rc=0, stdout=, stderr= 10328 1726773068.38283: _low_level_execute_command(): starting 10328 1726773068.38290: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547/AnsiballZ_command.py && sleep 0' 10328 1726773068.53987: stdout chunk (state=2): >>> {"changed": true, "stdout": "65000", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "start": "2024-09-19 15:11:07.874456", "end": "2024-09-19 15:11:07.877625", "delta": "0:00:00.003169", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10328 1726773068.54895: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10328 1726773068.54947: stderr chunk (state=3): >>><<< 10328 1726773068.54954: stdout chunk (state=3): >>><<< 10328 1726773068.54971: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65000", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "65000", "/sys/class/net/lo/mtu"], "start": "2024-09-19 15:11:07.874456", "end": "2024-09-19 15:11:07.877625", "delta": "0:00:00.003169", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 65000 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
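The command module args above suggest a task of roughly this shape (sketch; changed_when: false assumed from the final result), which fails unless the loopback MTU sysfs attribute reads back exactly 65000:

# Hypothetical reconstruction of the task at tests_change_settings.yml:105
- name: Check sysfs after reboot
  ansible.builtin.command: grep -x 65000 /sys/class/net/lo/mtu
  changed_when: false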
10328 1726773068.55069: done with _execute_module (ansible.legacy.command, {'_raw_params': 'grep -x 65000 /sys/class/net/lo/mtu', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10328 1726773068.55081: _low_level_execute_command(): starting 10328 1726773068.55088: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773068.2921185-10328-15162097597547/ > /dev/null 2>&1 && sleep 0' 10328 1726773068.57539: stderr chunk (state=2): >>><<< 10328 1726773068.57553: stdout chunk (state=2): >>><<< 10328 1726773068.57569: _low_level_execute_command() done: rc=0, stdout=, stderr= 10328 1726773068.57579: handler run complete 10328 1726773068.57598: Evaluated conditional (False): False 10328 1726773068.57608: attempt loop complete, returning result 10328 1726773068.57612: _execute() done 10328 1726773068.57615: dumping result to json 10328 1726773068.57621: done dumping result, returning 10328 1726773068.57626: done running TaskExecutor() for managed_node2/TASK: Check sysfs after reboot [0affffe7-6841-885f-bbcf-000000000017] 10328 1726773068.57631: sending task result for task 0affffe7-6841-885f-bbcf-000000000017 10328 1726773068.57659: done sending task result for task 0affffe7-6841-885f-bbcf-000000000017 10328 1726773068.57661: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-x", "65000", "/sys/class/net/lo/mtu" ], "delta": "0:00:00.003169", "end": "2024-09-19 15:11:07.877625", "rc": 0, "start": "2024-09-19 15:11:07.874456" } STDOUT: 65000 8240 1726773068.57956: no more pending results, returning what we have 8240 1726773068.57958: results queue empty 8240 1726773068.57959: checking for any_errors_fatal 8240 1726773068.57964: done checking for any_errors_fatal 8240 1726773068.57965: checking for max_fail_percentage 8240 1726773068.57966: done checking for max_fail_percentage 8240 1726773068.57966: checking to see if all hosts have failed and the running result is not ok 8240 1726773068.57967: done checking to see if all hosts have failed 8240 1726773068.57967: getting the remaining hosts for this loop 8240 1726773068.57968: done getting the remaining hosts for this loop 8240 1726773068.57972: getting the next task for host managed_node2 8240 1726773068.57979: done getting next task for host managed_node2 8240 1726773068.57980: ^ task is: TASK: Check sysctl after reboot 8240 1726773068.57981: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773068.57985: getting variables 8240 1726773068.57987: in VariableManager get_vars() 8240 1726773068.58016: Calling all_inventory to load vars for managed_node2 8240 1726773068.58018: Calling groups_inventory to load vars for managed_node2 8240 1726773068.58019: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773068.58028: Calling all_plugins_play to load vars for managed_node2 8240 1726773068.58030: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773068.58032: Calling groups_plugins_play to load vars for managed_node2 8240 1726773068.58180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773068.58293: done with get_vars() 8240 1726773068.58301: done getting variables 8240 1726773068.58345: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysctl after reboot] *********************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:109 Thursday 19 September 2024 15:11:08 -0400 (0:00:00.327) 0:00:47.227 **** 8240 1726773068.58366: entering _queue_task() for managed_node2/shell 8240 1726773068.58540: worker is 1 (out of 1 available) 8240 1726773068.58555: exiting _queue_task() for managed_node2/shell 8240 1726773068.58567: done queuing things up, now waiting for results queue to drain 8240 1726773068.58568: waiting for pending results... 10336 1726773068.58694: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10336 1726773068.58798: in run() - task 0affffe7-6841-885f-bbcf-000000000018 10336 1726773068.58817: variable 'ansible_search_path' from source: unknown 10336 1726773068.58846: calling self._execute() 10336 1726773068.58919: variable 'ansible_host' from source: host vars for 'managed_node2' 10336 1726773068.58928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10336 1726773068.58937: variable 'omit' from source: magic vars 10336 1726773068.59015: variable 'omit' from source: magic vars 10336 1726773068.59040: variable 'omit' from source: magic vars 10336 1726773068.59065: variable 'omit' from source: magic vars 10336 1726773068.59102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10336 1726773068.59131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10336 1726773068.59151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10336 1726773068.59166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10336 1726773068.59181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10336 1726773068.59206: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10336 1726773068.59211: variable 'ansible_host' from source: host vars for 'managed_node2' 10336 1726773068.59215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10336 1726773068.59289: Set connection var ansible_pipelining to False 10336 
1726773068.59297: Set connection var ansible_timeout to 10 10336 1726773068.59304: Set connection var ansible_module_compression to ZIP_DEFLATED 10336 1726773068.59308: Set connection var ansible_shell_type to sh 10336 1726773068.59313: Set connection var ansible_shell_executable to /bin/sh 10336 1726773068.59317: Set connection var ansible_connection to ssh 10336 1726773068.59334: variable 'ansible_shell_executable' from source: unknown 10336 1726773068.59339: variable 'ansible_connection' from source: unknown 10336 1726773068.59342: variable 'ansible_module_compression' from source: unknown 10336 1726773068.59346: variable 'ansible_shell_type' from source: unknown 10336 1726773068.59349: variable 'ansible_shell_executable' from source: unknown 10336 1726773068.59350: variable 'ansible_host' from source: host vars for 'managed_node2' 10336 1726773068.59352: variable 'ansible_pipelining' from source: unknown 10336 1726773068.59354: variable 'ansible_timeout' from source: unknown 10336 1726773068.59356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10336 1726773068.59452: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10336 1726773068.59463: variable 'omit' from source: magic vars 10336 1726773068.59468: starting attempt loop 10336 1726773068.59472: running the handler 10336 1726773068.59479: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10336 1726773068.59496: _low_level_execute_command(): starting 10336 1726773068.59505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10336 1726773068.61857: stdout chunk (state=2): >>>/root <<< 10336 1726773068.61984: stderr chunk (state=3): >>><<< 10336 1726773068.61994: stdout chunk (state=3): >>><<< 10336 1726773068.62014: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10336 1726773068.62030: _low_level_execute_command(): starting 10336 1726773068.62038: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964 `" && echo ansible-tmp-1726773068.6202464-10336-16508019853964="` echo /root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964 `" ) && sleep 0' 10336 1726773068.64541: stdout chunk (state=2): >>>ansible-tmp-1726773068.6202464-10336-16508019853964=/root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964 <<< 10336 1726773068.64674: stderr chunk (state=3): >>><<< 10336 1726773068.64682: stdout chunk (state=3): >>><<< 10336 1726773068.64701: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773068.6202464-10336-16508019853964=/root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964 , stderr= 10336 1726773068.64727: variable 'ansible_module_compression' from source: unknown 10336 1726773068.64778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10336 1726773068.64811: variable 'ansible_facts' from source: 
unknown 10336 1726773068.64890: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964/AnsiballZ_command.py 10336 1726773068.64996: Sending initial data 10336 1726773068.65003: Sent initial data (154 bytes) 10336 1726773068.67532: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp6gm6gkoo /root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964/AnsiballZ_command.py <<< 10336 1726773068.68663: stderr chunk (state=3): >>><<< 10336 1726773068.68675: stdout chunk (state=3): >>><<< 10336 1726773068.68698: done transferring module to remote 10336 1726773068.68710: _low_level_execute_command(): starting 10336 1726773068.68716: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964/ /root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964/AnsiballZ_command.py && sleep 0' 10336 1726773068.71108: stderr chunk (state=2): >>><<< 10336 1726773068.71120: stdout chunk (state=2): >>><<< 10336 1726773068.71137: _low_level_execute_command() done: rc=0, stdout=, stderr= 10336 1726773068.71143: _low_level_execute_command(): starting 10336 1726773068.71148: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964/AnsiballZ_command.py && sleep 0' 10336 1726773068.86758: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "start": "2024-09-19 15:11:08.202361", "end": "2024-09-19 15:11:08.208330", "delta": "0:00:00.005969", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10336 1726773068.87930: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10336 1726773068.87983: stderr chunk (state=3): >>><<< 10336 1726773068.87991: stdout chunk (state=3): >>><<< 10336 1726773068.88009: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "start": "2024-09-19 15:11:08.202361", "end": "2024-09-19 15:11:08.208330", "delta": "0:00:00.005969", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
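Based on the _raw_params shown above, this third post-reboot check is likely defined as follows (sketch; changed_when: false assumed from the final result). Note it asserts the opposite condition of the earlier checks: grep -v succeeds only if kernel.threads-max is not 29968, i.e. the value was removed or reset:

# Hypothetical reconstruction of the task at tests_change_settings.yml:109
- name: Check sysctl after reboot
  ansible.builtin.shell: |
    set -euo pipefail
    sysctl -n kernel.threads-max | grep -Lxvq 29968
  changed_when: false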
10336 1726773068.88052: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10336 1726773068.88062: _low_level_execute_command(): starting 10336 1726773068.88069: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773068.6202464-10336-16508019853964/ > /dev/null 2>&1 && sleep 0' 10336 1726773068.90496: stderr chunk (state=2): >>><<< 10336 1726773068.90506: stdout chunk (state=2): >>><<< 10336 1726773068.90521: _low_level_execute_command() done: rc=0, stdout=, stderr= 10336 1726773068.90529: handler run complete 10336 1726773068.90547: Evaluated conditional (False): False 10336 1726773068.90557: attempt loop complete, returning result 10336 1726773068.90560: _execute() done 10336 1726773068.90563: dumping result to json 10336 1726773068.90568: done dumping result, returning 10336 1726773068.90577: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [0affffe7-6841-885f-bbcf-000000000018] 10336 1726773068.90584: sending task result for task 0affffe7-6841-885f-bbcf-000000000018 10336 1726773068.90616: done sending task result for task 0affffe7-6841-885f-bbcf-000000000018 10336 1726773068.90619: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n kernel.threads-max | grep -Lxvq 29968", "delta": "0:00:00.005969", "end": "2024-09-19 15:11:08.208330", "rc": 0, "start": "2024-09-19 15:11:08.202361" } 8240 1726773068.90758: no more pending results, returning what we have 8240 1726773068.90761: results queue empty 8240 1726773068.90762: checking for any_errors_fatal 8240 1726773068.90769: done checking for any_errors_fatal 8240 1726773068.90770: checking for max_fail_percentage 8240 1726773068.90772: done checking for max_fail_percentage 8240 1726773068.90772: checking to see if all hosts have failed and the running result is not ok 8240 1726773068.90773: done checking to see if all hosts have failed 8240 1726773068.90774: getting the remaining hosts for this loop 8240 1726773068.90775: done getting the remaining hosts for this loop 8240 1726773068.90778: getting the next task for host managed_node2 8240 1726773068.90784: done getting next task for host managed_node2 8240 1726773068.90787: ^ task is: TASK: Check with tuned verify 8240 1726773068.90789: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773068.90792: getting variables 8240 1726773068.90794: in VariableManager get_vars() 8240 1726773068.90827: Calling all_inventory to load vars for managed_node2 8240 1726773068.90830: Calling groups_inventory to load vars for managed_node2 8240 1726773068.90832: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773068.90842: Calling all_plugins_play to load vars for managed_node2 8240 1726773068.90845: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773068.90847: Calling groups_plugins_play to load vars for managed_node2 8240 1726773068.90958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773068.91067: done with get_vars() 8240 1726773068.91077: done getting variables 8240 1726773068.91123: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check with tuned verify] ************************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:115 Thursday 19 September 2024 15:11:08 -0400 (0:00:00.327) 0:00:47.555 **** 8240 1726773068.91145: entering _queue_task() for managed_node2/command 8240 1726773068.91308: worker is 1 (out of 1 available) 8240 1726773068.91322: exiting _queue_task() for managed_node2/command 8240 1726773068.91334: done queuing things up, now waiting for results queue to drain 8240 1726773068.91336: waiting for pending results... 10344 1726773068.91457: running TaskExecutor() for managed_node2/TASK: Check with tuned verify 10344 1726773068.91551: in run() - task 0affffe7-6841-885f-bbcf-000000000019 10344 1726773068.91568: variable 'ansible_search_path' from source: unknown 10344 1726773068.91601: calling self._execute() 10344 1726773068.91667: variable 'ansible_host' from source: host vars for 'managed_node2' 10344 1726773068.91678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10344 1726773068.91688: variable 'omit' from source: magic vars 10344 1726773068.91761: variable 'omit' from source: magic vars 10344 1726773068.91790: variable 'omit' from source: magic vars 10344 1726773068.91812: variable 'omit' from source: magic vars 10344 1726773068.91844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10344 1726773068.91872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10344 1726773068.91894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10344 1726773068.91910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10344 1726773068.91922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10344 1726773068.91946: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10344 1726773068.91952: variable 'ansible_host' from source: host vars for 'managed_node2' 10344 1726773068.91957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10344 1726773068.92029: Set connection var ansible_pipelining to False 10344 
1726773068.92037: Set connection var ansible_timeout to 10 10344 1726773068.92044: Set connection var ansible_module_compression to ZIP_DEFLATED 10344 1726773068.92048: Set connection var ansible_shell_type to sh 10344 1726773068.92052: Set connection var ansible_shell_executable to /bin/sh 10344 1726773068.92059: Set connection var ansible_connection to ssh 10344 1726773068.92075: variable 'ansible_shell_executable' from source: unknown 10344 1726773068.92079: variable 'ansible_connection' from source: unknown 10344 1726773068.92081: variable 'ansible_module_compression' from source: unknown 10344 1726773068.92083: variable 'ansible_shell_type' from source: unknown 10344 1726773068.92084: variable 'ansible_shell_executable' from source: unknown 10344 1726773068.92088: variable 'ansible_host' from source: host vars for 'managed_node2' 10344 1726773068.92090: variable 'ansible_pipelining' from source: unknown 10344 1726773068.92092: variable 'ansible_timeout' from source: unknown 10344 1726773068.92094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10344 1726773068.92189: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10344 1726773068.92202: variable 'omit' from source: magic vars 10344 1726773068.92209: starting attempt loop 10344 1726773068.92213: running the handler 10344 1726773068.92225: _low_level_execute_command(): starting 10344 1726773068.92233: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10344 1726773068.94531: stdout chunk (state=2): >>>/root <<< 10344 1726773068.94675: stderr chunk (state=3): >>><<< 10344 1726773068.94683: stdout chunk (state=3): >>><<< 10344 1726773068.94706: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10344 1726773068.94722: _low_level_execute_command(): starting 10344 1726773068.94728: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616 `" && echo ansible-tmp-1726773068.9471524-10344-228573240806616="` echo /root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616 `" ) && sleep 0' 10344 1726773068.97372: stdout chunk (state=2): >>>ansible-tmp-1726773068.9471524-10344-228573240806616=/root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616 <<< 10344 1726773068.97501: stderr chunk (state=3): >>><<< 10344 1726773068.97508: stdout chunk (state=3): >>><<< 10344 1726773068.97523: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773068.9471524-10344-228573240806616=/root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616 , stderr= 10344 1726773068.97550: variable 'ansible_module_compression' from source: unknown 10344 1726773068.97597: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10344 1726773068.97627: variable 'ansible_facts' from source: unknown 10344 1726773068.97705: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616/AnsiballZ_command.py 10344 1726773068.97808: Sending initial data 10344 1726773068.97815: Sent initial data (155 bytes) 10344 1726773069.00316: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-8240kvoq26km/tmph62q1swp /root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616/AnsiballZ_command.py <<< 10344 1726773069.01442: stderr chunk (state=3): >>><<< 10344 1726773069.01452: stdout chunk (state=3): >>><<< 10344 1726773069.01476: done transferring module to remote 10344 1726773069.01490: _low_level_execute_command(): starting 10344 1726773069.01495: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616/ /root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616/AnsiballZ_command.py && sleep 0' 10344 1726773069.03853: stderr chunk (state=2): >>><<< 10344 1726773069.03863: stdout chunk (state=2): >>><<< 10344 1726773069.03879: _low_level_execute_command() done: rc=0, stdout=, stderr= 10344 1726773069.03884: _low_level_execute_command(): starting 10344 1726773069.03891: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616/AnsiballZ_command.py && sleep 0' 10344 1726773069.29591: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:08.529458", "end": "2024-09-19 15:11:08.633681", "delta": "0:00:00.104223", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10344 1726773069.30518: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10344 1726773069.30564: stderr chunk (state=3): >>><<< 10344 1726773069.30571: stdout chunk (state=3): >>><<< 10344 1726773069.30590: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:08.529458", "end": "2024-09-19 15:11:08.633681", "delta": "0:00:00.104223", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
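The verification step above corresponds to a command task of roughly this shape (sketch; changed_when: false assumed from the final result). tuned-adm verify -i compares the live system settings against the active profile, which here includes the kernel_settings child profile:

# Hypothetical reconstruction of the task at tests_change_settings.yml:115
- name: Check with tuned verify
  ansible.builtin.command: tuned-adm verify -i
  changed_when: false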
10344 1726773069.30631: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10344 1726773069.30641: _low_level_execute_command(): starting 10344 1726773069.30648: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773068.9471524-10344-228573240806616/ > /dev/null 2>&1 && sleep 0' 10344 1726773069.33149: stderr chunk (state=2): >>><<< 10344 1726773069.33158: stdout chunk (state=2): >>><<< 10344 1726773069.33173: _low_level_execute_command() done: rc=0, stdout=, stderr= 10344 1726773069.33180: handler run complete 10344 1726773069.33200: Evaluated conditional (False): False 10344 1726773069.33209: attempt loop complete, returning result 10344 1726773069.33213: _execute() done 10344 1726773069.33216: dumping result to json 10344 1726773069.33221: done dumping result, returning 10344 1726773069.33228: done running TaskExecutor() for managed_node2/TASK: Check with tuned verify [0affffe7-6841-885f-bbcf-000000000019] 10344 1726773069.33234: sending task result for task 0affffe7-6841-885f-bbcf-000000000019 10344 1726773069.33263: done sending task result for task 0affffe7-6841-885f-bbcf-000000000019 10344 1726773069.33267: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.104223", "end": "2024-09-19 15:11:08.633681", "rc": 0, "start": "2024-09-19 15:11:08.529458" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8240 1726773069.33430: no more pending results, returning what we have 8240 1726773069.33433: results queue empty 8240 1726773069.33434: checking for any_errors_fatal 8240 1726773069.33441: done checking for any_errors_fatal 8240 1726773069.33441: checking for max_fail_percentage 8240 1726773069.33443: done checking for max_fail_percentage 8240 1726773069.33443: checking to see if all hosts have failed and the running result is not ok 8240 1726773069.33444: done checking to see if all hosts have failed 8240 1726773069.33445: getting the remaining hosts for this loop 8240 1726773069.33446: done getting the remaining hosts for this loop 8240 1726773069.33451: getting the next task for host managed_node2 8240 1726773069.33458: done getting next task for host managed_node2 8240 1726773069.33459: ^ task is: TASK: Apply role again and remove settings 8240 1726773069.33461: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773069.33464: getting variables 8240 1726773069.33466: in VariableManager get_vars() 8240 1726773069.33501: Calling all_inventory to load vars for managed_node2 8240 1726773069.33503: Calling groups_inventory to load vars for managed_node2 8240 1726773069.33504: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.33513: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.33515: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.33516: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.33667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.33781: done with get_vars() 8240 1726773069.33791: done getting variables TASK [Apply role again and remove settings] ************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:119 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.427) 0:00:47.982 **** 8240 1726773069.33857: entering _queue_task() for managed_node2/include_role 8240 1726773069.34027: worker is 1 (out of 1 available) 8240 1726773069.34042: exiting _queue_task() for managed_node2/include_role 8240 1726773069.34056: done queuing things up, now waiting for results queue to drain 8240 1726773069.34058: waiting for pending results... 10355 1726773069.34174: running TaskExecutor() for managed_node2/TASK: Apply role again and remove settings 10355 1726773069.34272: in run() - task 0affffe7-6841-885f-bbcf-00000000001a 10355 1726773069.34290: variable 'ansible_search_path' from source: unknown 10355 1726773069.34320: calling self._execute() 10355 1726773069.34387: variable 'ansible_host' from source: host vars for 'managed_node2' 10355 1726773069.34396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10355 1726773069.34405: variable 'omit' from source: magic vars 10355 1726773069.34474: _execute() done 10355 1726773069.34479: dumping result to json 10355 1726773069.34482: done dumping result, returning 10355 1726773069.34488: done running TaskExecutor() for managed_node2/TASK: Apply role again and remove settings [0affffe7-6841-885f-bbcf-00000000001a] 10355 1726773069.34495: sending task result for task 0affffe7-6841-885f-bbcf-00000000001a 10355 1726773069.34528: done sending task result for task 0affffe7-6841-885f-bbcf-00000000001a 10355 1726773069.34531: WORKER PROCESS EXITING 8240 1726773069.34644: no more pending results, returning what we have 8240 1726773069.34648: in VariableManager get_vars() 8240 1726773069.34684: Calling all_inventory to load vars for managed_node2 8240 1726773069.34688: Calling groups_inventory to load vars for managed_node2 8240 1726773069.34690: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.34699: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.34701: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.34703: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.34809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.34918: done with get_vars() 8240 1726773069.34924: variable 'ansible_search_path' from source: unknown 8240 1726773069.35925: variable 'omit' from source: magic vars 8240 1726773069.35939: variable 'omit' from source: magic vars 8240 1726773069.35949: variable 
'omit' from source: magic vars 8240 1726773069.35952: we have included files to process 8240 1726773069.35952: generating all_blocks data 8240 1726773069.35953: done generating all_blocks data 8240 1726773069.35956: processing included file: fedora.linux_system_roles.kernel_settings 8240 1726773069.35972: in VariableManager get_vars() 8240 1726773069.35983: done with get_vars() 8240 1726773069.36005: in VariableManager get_vars() 8240 1726773069.36016: done with get_vars() 8240 1726773069.36044: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8240 1726773069.36084: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8240 1726773069.36114: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8240 1726773069.36160: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8240 1726773069.36473: in VariableManager get_vars() 8240 1726773069.36490: done with get_vars() 8240 1726773069.37315: in VariableManager get_vars() 8240 1726773069.37329: done with get_vars() 8240 1726773069.37435: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8240 1726773069.37976: iterating over new_blocks loaded from include file 8240 1726773069.37978: in VariableManager get_vars() 8240 1726773069.37990: done with get_vars() 8240 1726773069.37992: filtering new block on tags 8240 1726773069.38013: done filtering new block on tags 8240 1726773069.38015: in VariableManager get_vars() 8240 1726773069.38023: done with get_vars() 8240 1726773069.38024: filtering new block on tags 8240 1726773069.38045: done filtering new block on tags 8240 1726773069.38048: in VariableManager get_vars() 8240 1726773069.38056: done with get_vars() 8240 1726773069.38057: filtering new block on tags 8240 1726773069.38144: done filtering new block on tags 8240 1726773069.38146: in VariableManager get_vars() 8240 1726773069.38155: done with get_vars() 8240 1726773069.38157: filtering new block on tags 8240 1726773069.38168: done filtering new block on tags 8240 1726773069.38169: done iterating over new_blocks loaded from include file 8240 1726773069.38172: extending task lists for all hosts with included blocks 8240 1726773069.39104: done extending task lists 8240 1726773069.39105: done processing included files 8240 1726773069.39106: results queue empty 8240 1726773069.39106: checking for any_errors_fatal 8240 1726773069.39109: done checking for any_errors_fatal 8240 1726773069.39109: checking for max_fail_percentage 8240 1726773069.39110: done checking for max_fail_percentage 8240 1726773069.39110: checking to see if all hosts have failed and the running result is not ok 8240 1726773069.39111: done checking to see if all hosts have failed 8240 1726773069.39111: getting the remaining hosts for this loop 8240 1726773069.39112: done getting the remaining hosts for this loop 8240 1726773069.39113: getting the next task for host managed_node2 8240 1726773069.39116: done getting next task for host managed_node2 8240 1726773069.39118: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8240 1726773069.39119: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773069.39127: getting variables 8240 1726773069.39127: in VariableManager get_vars() 8240 1726773069.39137: Calling all_inventory to load vars for managed_node2 8240 1726773069.39138: Calling groups_inventory to load vars for managed_node2 8240 1726773069.39139: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.39143: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.39145: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.39146: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.39226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.39347: done with get_vars() 8240 1726773069.39353: done getting variables 8240 1726773069.39382: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.055) 0:00:48.037 **** 8240 1726773069.39404: entering _queue_task() for managed_node2/fail 8240 1726773069.39586: worker is 1 (out of 1 available) 8240 1726773069.39599: exiting _queue_task() for managed_node2/fail 8240 1726773069.39611: done queuing things up, now waiting for results queue to drain 8240 1726773069.39613: waiting for pending results... 
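The include processed above corresponds to the test re-entering the kernel_settings role from tests_change_settings.yml:119 ("Apply role again and remove settings"). A minimal sketch of what such a test task could look like follows; the kernel_settings_sysctl entry is purely illustrative and not taken from the test file, and treating "state: absent" as the removal mechanism is an assumption about the role's interface:

- name: Apply role again and remove settings
  include_role:
    name: fedora.linux_system_roles.kernel_settings
  vars:
    kernel_settings_sysctl:
      # Hypothetical entry: the real test supplies its own settings to remove;
      # "state: absent" is assumed here as the way to drop a single setting.
      - name: fs.file-max
        state: absent
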
10356 1726773069.39730: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 10356 1726773069.39840: in run() - task 0affffe7-6841-885f-bbcf-0000000002ff 10356 1726773069.39855: variable 'ansible_search_path' from source: unknown 10356 1726773069.39860: variable 'ansible_search_path' from source: unknown 10356 1726773069.39890: calling self._execute() 10356 1726773069.39959: variable 'ansible_host' from source: host vars for 'managed_node2' 10356 1726773069.39967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10356 1726773069.39976: variable 'omit' from source: magic vars 10356 1726773069.40315: variable 'kernel_settings_sysctl' from source: include params 10356 1726773069.40329: variable '__kernel_settings_state_empty' from source: role '' all vars 10356 1726773069.40338: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 10356 1726773069.40534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10356 1726773069.42029: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10356 1726773069.42075: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10356 1726773069.42106: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10356 1726773069.42133: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10356 1726773069.42152: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10356 1726773069.42208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10356 1726773069.42230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10356 1726773069.42248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10356 1726773069.42276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10356 1726773069.42290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10356 1726773069.42328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10356 1726773069.42346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10356 1726773069.42363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10356 1726773069.42391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10356 1726773069.42404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10356 1726773069.42433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10356 1726773069.42450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10356 1726773069.42467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10356 1726773069.42495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10356 1726773069.42507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10356 1726773069.42682: variable 'kernel_settings_sysctl' from source: include params 10356 1726773069.42735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10356 1726773069.42854: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10356 1726773069.42883: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10356 1726773069.42908: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10356 1726773069.42930: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10356 1726773069.42960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10356 1726773069.42976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10356 1726773069.42999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10356 1726773069.43017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10356 1726773069.43038: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 10356 1726773069.43043: when evaluation is False, skipping this task 
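For reference, the two conditionals recorded in this trace (kernel_settings_sysctl != __kernel_settings_state_empty, and the selectattr expression just evaluated) appear to guard a fail task of roughly the following shape; the failure message is a placeholder, not the role's actual wording:

- name: Check sysctl settings for boolean values
  fail:
    msg: Boolean values are not allowed in kernel_settings_sysctl  # placeholder message
  when:
    - kernel_settings_sysctl != __kernel_settings_state_empty
    - >-
      (kernel_settings_sysctl | selectattr("value", "defined") |
       selectattr("value", "sameas", true) | list | length > 0) or
      (kernel_settings_sysctl | selectattr("value", "defined") |
       selectattr("value", "sameas", false) | list | length > 0)
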
10356 1726773069.43047: _execute() done 10356 1726773069.43050: dumping result to json 10356 1726773069.43054: done dumping result, returning 10356 1726773069.43061: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-885f-bbcf-0000000002ff] 10356 1726773069.43066: sending task result for task 0affffe7-6841-885f-bbcf-0000000002ff 10356 1726773069.43089: done sending task result for task 0affffe7-6841-885f-bbcf-0000000002ff 10356 1726773069.43093: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8240 1726773069.43228: no more pending results, returning what we have 8240 1726773069.43231: results queue empty 8240 1726773069.43232: checking for any_errors_fatal 8240 1726773069.43234: done checking for any_errors_fatal 8240 1726773069.43234: checking for max_fail_percentage 8240 1726773069.43236: done checking for max_fail_percentage 8240 1726773069.43236: checking to see if all hosts have failed and the running result is not ok 8240 1726773069.43237: done checking to see if all hosts have failed 8240 1726773069.43238: getting the remaining hosts for this loop 8240 1726773069.43239: done getting the remaining hosts for this loop 8240 1726773069.43242: getting the next task for host managed_node2 8240 1726773069.43248: done getting next task for host managed_node2 8240 1726773069.43252: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8240 1726773069.43254: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773069.43269: getting variables 8240 1726773069.43273: in VariableManager get_vars() 8240 1726773069.43309: Calling all_inventory to load vars for managed_node2 8240 1726773069.43312: Calling groups_inventory to load vars for managed_node2 8240 1726773069.43314: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.43323: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.43326: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.43328: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.43438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.43560: done with get_vars() 8240 1726773069.43568: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.042) 0:00:48.080 **** 8240 1726773069.43636: entering _queue_task() for managed_node2/include_tasks 8240 1726773069.43800: worker is 1 (out of 1 available) 8240 1726773069.43815: exiting _queue_task() for managed_node2/include_tasks 8240 1726773069.43828: done queuing things up, now waiting for results queue to drain 8240 1726773069.43829: waiting for pending results... 10357 1726773069.43948: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 10357 1726773069.44054: in run() - task 0affffe7-6841-885f-bbcf-000000000300 10357 1726773069.44071: variable 'ansible_search_path' from source: unknown 10357 1726773069.44075: variable 'ansible_search_path' from source: unknown 10357 1726773069.44104: calling self._execute() 10357 1726773069.44168: variable 'ansible_host' from source: host vars for 'managed_node2' 10357 1726773069.44178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10357 1726773069.44188: variable 'omit' from source: magic vars 10357 1726773069.44258: _execute() done 10357 1726773069.44264: dumping result to json 10357 1726773069.44268: done dumping result, returning 10357 1726773069.44274: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-885f-bbcf-000000000300] 10357 1726773069.44281: sending task result for task 0affffe7-6841-885f-bbcf-000000000300 10357 1726773069.44306: done sending task result for task 0affffe7-6841-885f-bbcf-000000000300 10357 1726773069.44310: WORKER PROCESS EXITING 8240 1726773069.44419: no more pending results, returning what we have 8240 1726773069.44423: in VariableManager get_vars() 8240 1726773069.44460: Calling all_inventory to load vars for managed_node2 8240 1726773069.44463: Calling groups_inventory to load vars for managed_node2 8240 1726773069.44464: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.44474: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.44476: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.44478: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.44621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.44732: done with get_vars() 8240 1726773069.44737: variable 'ansible_search_path' from source: unknown 8240 
1726773069.44738: variable 'ansible_search_path' from source: unknown 8240 1726773069.44761: we have included files to process 8240 1726773069.44761: generating all_blocks data 8240 1726773069.44762: done generating all_blocks data 8240 1726773069.44767: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773069.44767: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773069.44768: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8240 1726773069.45224: done processing included file 8240 1726773069.45226: iterating over new_blocks loaded from include file 8240 1726773069.45226: in VariableManager get_vars() 8240 1726773069.45240: done with get_vars() 8240 1726773069.45241: filtering new block on tags 8240 1726773069.45259: done filtering new block on tags 8240 1726773069.45260: in VariableManager get_vars() 8240 1726773069.45274: done with get_vars() 8240 1726773069.45275: filtering new block on tags 8240 1726773069.45298: done filtering new block on tags 8240 1726773069.45300: in VariableManager get_vars() 8240 1726773069.45314: done with get_vars() 8240 1726773069.45315: filtering new block on tags 8240 1726773069.45335: done filtering new block on tags 8240 1726773069.45337: in VariableManager get_vars() 8240 1726773069.45349: done with get_vars() 8240 1726773069.45350: filtering new block on tags 8240 1726773069.45365: done filtering new block on tags 8240 1726773069.45366: done iterating over new_blocks loaded from include file 8240 1726773069.45367: extending task lists for all hosts with included blocks 8240 1726773069.45460: done extending task lists 8240 1726773069.45461: done processing included files 8240 1726773069.45461: results queue empty 8240 1726773069.45462: checking for any_errors_fatal 8240 1726773069.45464: done checking for any_errors_fatal 8240 1726773069.45465: checking for max_fail_percentage 8240 1726773069.45465: done checking for max_fail_percentage 8240 1726773069.45466: checking to see if all hosts have failed and the running result is not ok 8240 1726773069.45467: done checking to see if all hosts have failed 8240 1726773069.45467: getting the remaining hosts for this loop 8240 1726773069.45468: done getting the remaining hosts for this loop 8240 1726773069.45470: getting the next task for host managed_node2 8240 1726773069.45473: done getting next task for host managed_node2 8240 1726773069.45474: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8240 1726773069.45476: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773069.45483: getting variables 8240 1726773069.45483: in VariableManager get_vars() 8240 1726773069.45493: Calling all_inventory to load vars for managed_node2 8240 1726773069.45494: Calling groups_inventory to load vars for managed_node2 8240 1726773069.45496: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.45499: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.45500: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.45501: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.45598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.45705: done with get_vars() 8240 1726773069.45711: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.021) 0:00:48.101 **** 8240 1726773069.45755: entering _queue_task() for managed_node2/setup 8240 1726773069.45904: worker is 1 (out of 1 available) 8240 1726773069.45918: exiting _queue_task() for managed_node2/setup 8240 1726773069.45930: done queuing things up, now waiting for results queue to drain 8240 1726773069.45931: waiting for pending results... 10358 1726773069.46048: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 10358 1726773069.46166: in run() - task 0affffe7-6841-885f-bbcf-000000000413 10358 1726773069.46182: variable 'ansible_search_path' from source: unknown 10358 1726773069.46188: variable 'ansible_search_path' from source: unknown 10358 1726773069.46214: calling self._execute() 10358 1726773069.46277: variable 'ansible_host' from source: host vars for 'managed_node2' 10358 1726773069.46286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10358 1726773069.46295: variable 'omit' from source: magic vars 10358 1726773069.46647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10358 1726773069.48177: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10358 1726773069.48229: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10358 1726773069.48258: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10358 1726773069.48290: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10358 1726773069.48310: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10358 1726773069.48367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10358 1726773069.48392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 10358 1726773069.48411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10358 1726773069.48440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10358 1726773069.48452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10358 1726773069.48494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10358 1726773069.48512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10358 1726773069.48528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10358 1726773069.48556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10358 1726773069.48568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10358 1726773069.48693: variable '__kernel_settings_required_facts' from source: role '' all vars 10358 1726773069.48705: variable 'ansible_facts' from source: unknown 10358 1726773069.48765: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10358 1726773069.48772: when evaluation is False, skipping this task 10358 1726773069.48777: _execute() done 10358 1726773069.48781: dumping result to json 10358 1726773069.48786: done dumping result, returning 10358 1726773069.48794: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-885f-bbcf-000000000413] 10358 1726773069.48799: sending task result for task 0affffe7-6841-885f-bbcf-000000000413 10358 1726773069.48822: done sending task result for task 0affffe7-6841-885f-bbcf-000000000413 10358 1726773069.48825: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8240 1726773069.48940: no more pending results, returning what we have 8240 1726773069.48943: results queue empty 8240 1726773069.48944: checking for any_errors_fatal 8240 1726773069.48946: done checking for any_errors_fatal 8240 1726773069.48947: checking for max_fail_percentage 8240 1726773069.48948: done checking for max_fail_percentage 8240 1726773069.48949: checking to see if all hosts have failed and the running result is not ok 8240 1726773069.48950: done checking to see if all hosts have failed 8240 1726773069.48950: getting the remaining hosts for this loop 8240 
1726773069.48951: done getting the remaining hosts for this loop 8240 1726773069.48955: getting the next task for host managed_node2 8240 1726773069.48963: done getting next task for host managed_node2 8240 1726773069.48967: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8240 1726773069.48971: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773069.48988: getting variables 8240 1726773069.48989: in VariableManager get_vars() 8240 1726773069.49026: Calling all_inventory to load vars for managed_node2 8240 1726773069.49029: Calling groups_inventory to load vars for managed_node2 8240 1726773069.49031: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.49040: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.49043: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.49045: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.49165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.49291: done with get_vars() 8240 1726773069.49299: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.036) 0:00:48.137 **** 8240 1726773069.49365: entering _queue_task() for managed_node2/stat 8240 1726773069.49531: worker is 1 (out of 1 available) 8240 1726773069.49545: exiting _queue_task() for managed_node2/stat 8240 1726773069.49557: done queuing things up, now waiting for results queue to drain 8240 1726773069.49558: waiting for pending results... 
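The "Ensure ansible_facts used by role" task skipped just above is a setup (fact-gathering) action guarded by the difference() condition shown in its false_condition; because the required facts are still cached from the first role run, the condition is False and no gathering happens. A rough sketch, assuming a minimal gather_subset (the real subset comes from the role's own variables):

- name: Ensure ansible_facts used by role
  setup:
    gather_subset: min   # assumed value; the role decides which fact subsets it needs
  when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0
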
10359 1726773069.49687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 10359 1726773069.49808: in run() - task 0affffe7-6841-885f-bbcf-000000000415 10359 1726773069.49824: variable 'ansible_search_path' from source: unknown 10359 1726773069.49829: variable 'ansible_search_path' from source: unknown 10359 1726773069.49856: calling self._execute() 10359 1726773069.49924: variable 'ansible_host' from source: host vars for 'managed_node2' 10359 1726773069.49932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10359 1726773069.49940: variable 'omit' from source: magic vars 10359 1726773069.50319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10359 1726773069.50490: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10359 1726773069.50522: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10359 1726773069.50548: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10359 1726773069.50575: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10359 1726773069.50633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10359 1726773069.50654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10359 1726773069.50677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10359 1726773069.50698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10359 1726773069.50782: variable '__kernel_settings_is_ostree' from source: set_fact 10359 1726773069.50796: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10359 1726773069.50800: when evaluation is False, skipping this task 10359 1726773069.50804: _execute() done 10359 1726773069.50807: dumping result to json 10359 1726773069.50811: done dumping result, returning 10359 1726773069.50817: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-885f-bbcf-000000000415] 10359 1726773069.50822: sending task result for task 0affffe7-6841-885f-bbcf-000000000415 10359 1726773069.50845: done sending task result for task 0affffe7-6841-885f-bbcf-000000000415 10359 1726773069.50848: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773069.50956: no more pending results, returning what we have 8240 1726773069.50959: results queue empty 8240 1726773069.50960: checking for any_errors_fatal 8240 1726773069.50967: done checking for any_errors_fatal 8240 1726773069.50968: checking for max_fail_percentage 8240 1726773069.50970: done checking for max_fail_percentage 8240 1726773069.50970: checking to see if all hosts have failed and the 
running result is not ok 8240 1726773069.50971: done checking to see if all hosts have failed 8240 1726773069.50971: getting the remaining hosts for this loop 8240 1726773069.50973: done getting the remaining hosts for this loop 8240 1726773069.50976: getting the next task for host managed_node2 8240 1726773069.50982: done getting next task for host managed_node2 8240 1726773069.50987: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8240 1726773069.50991: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773069.51005: getting variables 8240 1726773069.51007: in VariableManager get_vars() 8240 1726773069.51083: Calling all_inventory to load vars for managed_node2 8240 1726773069.51088: Calling groups_inventory to load vars for managed_node2 8240 1726773069.51090: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.51097: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.51099: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.51101: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.51200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.51316: done with get_vars() 8240 1726773069.51323: done getting variables 8240 1726773069.51362: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.020) 0:00:48.157 **** 8240 1726773069.51391: entering _queue_task() for managed_node2/set_fact 8240 1726773069.51547: worker is 1 (out of 1 available) 8240 1726773069.51561: exiting _queue_task() for managed_node2/set_fact 8240 1726773069.51575: done queuing things up, now waiting for results queue to drain 8240 1726773069.51577: waiting for pending results... 
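The stat-based ostree check just skipped and the set_fact task now being queued form a pair: the stat result feeds a flag that is only computed once (hence both skip here, since __kernel_settings_is_ostree was set during the first role application). A sketch of that pattern, assuming the conventional /run/ostree-booted marker file and a hypothetical register name; an analogous stat + set_fact pair, guarded by "not __kernel_settings_is_transactional is defined", handles the transactional-update check that follows:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # conventional ostree marker; path assumed here
  register: __ostree_booted_stat   # hypothetical register name
  when: not __kernel_settings_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined
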
10360 1726773069.51697: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 10360 1726773069.51810: in run() - task 0affffe7-6841-885f-bbcf-000000000416 10360 1726773069.51826: variable 'ansible_search_path' from source: unknown 10360 1726773069.51830: variable 'ansible_search_path' from source: unknown 10360 1726773069.51856: calling self._execute() 10360 1726773069.51923: variable 'ansible_host' from source: host vars for 'managed_node2' 10360 1726773069.51932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10360 1726773069.51940: variable 'omit' from source: magic vars 10360 1726773069.52267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10360 1726773069.52441: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10360 1726773069.52480: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10360 1726773069.52529: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10360 1726773069.52555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10360 1726773069.52619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10360 1726773069.52639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10360 1726773069.52658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10360 1726773069.52680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10360 1726773069.52763: variable '__kernel_settings_is_ostree' from source: set_fact 10360 1726773069.52775: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10360 1726773069.52780: when evaluation is False, skipping this task 10360 1726773069.52784: _execute() done 10360 1726773069.52790: dumping result to json 10360 1726773069.52793: done dumping result, returning 10360 1726773069.52799: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-885f-bbcf-000000000416] 10360 1726773069.52805: sending task result for task 0affffe7-6841-885f-bbcf-000000000416 10360 1726773069.52828: done sending task result for task 0affffe7-6841-885f-bbcf-000000000416 10360 1726773069.52832: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773069.52937: no more pending results, returning what we have 8240 1726773069.52940: results queue empty 8240 1726773069.52941: checking for any_errors_fatal 8240 1726773069.52946: done checking for any_errors_fatal 8240 1726773069.52947: checking for max_fail_percentage 8240 1726773069.52948: done checking for max_fail_percentage 8240 1726773069.52949: checking to see if all 
hosts have failed and the running result is not ok 8240 1726773069.52950: done checking to see if all hosts have failed 8240 1726773069.52950: getting the remaining hosts for this loop 8240 1726773069.52952: done getting the remaining hosts for this loop 8240 1726773069.52955: getting the next task for host managed_node2 8240 1726773069.52962: done getting next task for host managed_node2 8240 1726773069.52966: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8240 1726773069.52969: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773069.52989: getting variables 8240 1726773069.52991: in VariableManager get_vars() 8240 1726773069.53024: Calling all_inventory to load vars for managed_node2 8240 1726773069.53027: Calling groups_inventory to load vars for managed_node2 8240 1726773069.53028: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.53035: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.53037: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.53038: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.53147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.53271: done with get_vars() 8240 1726773069.53279: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.019) 0:00:48.177 **** 8240 1726773069.53347: entering _queue_task() for managed_node2/stat 8240 1726773069.53505: worker is 1 (out of 1 available) 8240 1726773069.53518: exiting _queue_task() for managed_node2/stat 8240 1726773069.53531: done queuing things up, now waiting for results queue to drain 8240 1726773069.53533: waiting for pending results... 
10361 1726773069.53646: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 10361 1726773069.53764: in run() - task 0affffe7-6841-885f-bbcf-000000000418 10361 1726773069.53779: variable 'ansible_search_path' from source: unknown 10361 1726773069.53783: variable 'ansible_search_path' from source: unknown 10361 1726773069.53810: calling self._execute() 10361 1726773069.53873: variable 'ansible_host' from source: host vars for 'managed_node2' 10361 1726773069.53940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10361 1726773069.53951: variable 'omit' from source: magic vars 10361 1726773069.54255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10361 1726773069.54420: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10361 1726773069.54452: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10361 1726773069.54478: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10361 1726773069.54505: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10361 1726773069.54560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10361 1726773069.54582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10361 1726773069.54603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10361 1726773069.54621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10361 1726773069.54706: variable '__kernel_settings_is_transactional' from source: set_fact 10361 1726773069.54717: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10361 1726773069.54721: when evaluation is False, skipping this task 10361 1726773069.54725: _execute() done 10361 1726773069.54729: dumping result to json 10361 1726773069.54733: done dumping result, returning 10361 1726773069.54737: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-885f-bbcf-000000000418] 10361 1726773069.54742: sending task result for task 0affffe7-6841-885f-bbcf-000000000418 10361 1726773069.54761: done sending task result for task 0affffe7-6841-885f-bbcf-000000000418 10361 1726773069.54763: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8240 1726773069.54983: no more pending results, returning what we have 8240 1726773069.54988: results queue empty 8240 1726773069.54989: checking for any_errors_fatal 8240 1726773069.54994: done checking for any_errors_fatal 8240 1726773069.54994: checking for max_fail_percentage 8240 1726773069.54996: done checking for max_fail_percentage 8240 
1726773069.54996: checking to see if all hosts have failed and the running result is not ok 8240 1726773069.54997: done checking to see if all hosts have failed 8240 1726773069.54997: getting the remaining hosts for this loop 8240 1726773069.54998: done getting the remaining hosts for this loop 8240 1726773069.55001: getting the next task for host managed_node2 8240 1726773069.55007: done getting next task for host managed_node2 8240 1726773069.55009: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8240 1726773069.55012: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773069.55023: getting variables 8240 1726773069.55024: in VariableManager get_vars() 8240 1726773069.55094: Calling all_inventory to load vars for managed_node2 8240 1726773069.55097: Calling groups_inventory to load vars for managed_node2 8240 1726773069.55098: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.55105: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.55107: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.55109: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.55206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.55322: done with get_vars() 8240 1726773069.55329: done getting variables 8240 1726773069.55369: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.020) 0:00:48.197 **** 8240 1726773069.55398: entering _queue_task() for managed_node2/set_fact 8240 1726773069.55551: worker is 1 (out of 1 available) 8240 1726773069.55563: exiting _queue_task() for managed_node2/set_fact 8240 1726773069.55578: done queuing things up, now waiting for results queue to drain 8240 1726773069.55580: waiting for pending results... 
10362 1726773069.55694: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 10362 1726773069.55810: in run() - task 0affffe7-6841-885f-bbcf-000000000419 10362 1726773069.55827: variable 'ansible_search_path' from source: unknown 10362 1726773069.55830: variable 'ansible_search_path' from source: unknown 10362 1726773069.55856: calling self._execute() 10362 1726773069.55924: variable 'ansible_host' from source: host vars for 'managed_node2' 10362 1726773069.55932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10362 1726773069.55942: variable 'omit' from source: magic vars 10362 1726773069.56259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10362 1726773069.56433: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10362 1726773069.56490: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10362 1726773069.56517: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10362 1726773069.56543: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10362 1726773069.56603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10362 1726773069.56622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10362 1726773069.56641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10362 1726773069.56660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10362 1726773069.56747: variable '__kernel_settings_is_transactional' from source: set_fact 10362 1726773069.56759: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10362 1726773069.56763: when evaluation is False, skipping this task 10362 1726773069.56767: _execute() done 10362 1726773069.56771: dumping result to json 10362 1726773069.56775: done dumping result, returning 10362 1726773069.56780: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-885f-bbcf-000000000419] 10362 1726773069.56787: sending task result for task 0affffe7-6841-885f-bbcf-000000000419 10362 1726773069.56811: done sending task result for task 0affffe7-6841-885f-bbcf-000000000419 10362 1726773069.56815: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8240 1726773069.56924: no more pending results, returning what we have 8240 1726773069.56927: results queue empty 8240 1726773069.56928: checking for any_errors_fatal 8240 1726773069.56933: done checking for any_errors_fatal 8240 1726773069.56933: checking for max_fail_percentage 8240 1726773069.56935: done checking for max_fail_percentage 8240 1726773069.56935: 
checking to see if all hosts have failed and the running result is not ok 8240 1726773069.56936: done checking to see if all hosts have failed 8240 1726773069.56937: getting the remaining hosts for this loop 8240 1726773069.56938: done getting the remaining hosts for this loop 8240 1726773069.56942: getting the next task for host managed_node2 8240 1726773069.56950: done getting next task for host managed_node2 8240 1726773069.56953: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8240 1726773069.56957: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773069.56974: getting variables 8240 1726773069.56976: in VariableManager get_vars() 8240 1726773069.57007: Calling all_inventory to load vars for managed_node2 8240 1726773069.57010: Calling groups_inventory to load vars for managed_node2 8240 1726773069.57011: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.57018: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.57020: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.57022: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.57128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.57248: done with get_vars() 8240 1726773069.57255: done getting variables 8240 1726773069.57299: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.019) 0:00:48.217 **** 8240 1726773069.57323: entering _queue_task() for managed_node2/include_vars 8240 1726773069.57480: worker is 1 (out of 1 available) 8240 1726773069.57495: exiting _queue_task() for managed_node2/include_vars 8240 1726773069.57510: done queuing things up, now waiting for results queue to drain 8240 1726773069.57511: waiting for pending results... 
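The trace that follows shows this task resolving the role's platform/version specific variables file through the first_found lookup, driven by a task variable named ffparams, and ultimately loading vars/default.yml. A reconstruction of the usual shape of such an include_vars task; the candidate file names are assumptions based on the common linux_system_roles layout, and only default.yml is confirmed by this run:

- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', ffparams) }}"
  vars:
    ffparams:
      files:
        # Candidate names are assumptions; only default.yml is known from this run.
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - default.yml
      paths:
        - "{{ role_path }}/vars"
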
10363 1726773069.57627: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 10363 1726773069.57740: in run() - task 0affffe7-6841-885f-bbcf-00000000041b 10363 1726773069.57756: variable 'ansible_search_path' from source: unknown 10363 1726773069.57760: variable 'ansible_search_path' from source: unknown 10363 1726773069.57788: calling self._execute() 10363 1726773069.57907: variable 'ansible_host' from source: host vars for 'managed_node2' 10363 1726773069.57916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10363 1726773069.57925: variable 'omit' from source: magic vars 10363 1726773069.57996: variable 'omit' from source: magic vars 10363 1726773069.58037: variable 'omit' from source: magic vars 10363 1726773069.58283: variable 'ffparams' from source: task vars 10363 1726773069.58373: variable 'ansible_facts' from source: unknown 10363 1726773069.58497: variable 'ansible_facts' from source: unknown 10363 1726773069.58582: variable 'ansible_facts' from source: unknown 10363 1726773069.58669: variable 'ansible_facts' from source: unknown 10363 1726773069.58747: variable 'role_path' from source: magic vars 10363 1726773069.58863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10363 1726773069.59008: Loaded config def from plugin (lookup/first_found) 10363 1726773069.59015: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 10363 1726773069.59044: variable 'ansible_search_path' from source: unknown 10363 1726773069.59063: variable 'ansible_search_path' from source: unknown 10363 1726773069.59072: variable 'ansible_search_path' from source: unknown 10363 1726773069.59080: variable 'ansible_search_path' from source: unknown 10363 1726773069.59088: variable 'ansible_search_path' from source: unknown 10363 1726773069.59103: variable 'omit' from source: magic vars 10363 1726773069.59121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10363 1726773069.59139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10363 1726773069.59156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10363 1726773069.59169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10363 1726773069.59178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10363 1726773069.59201: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10363 1726773069.59207: variable 'ansible_host' from source: host vars for 'managed_node2' 10363 1726773069.59211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10363 1726773069.59277: Set connection var ansible_pipelining to False 10363 1726773069.59285: Set connection var ansible_timeout to 10 10363 1726773069.59294: Set connection var ansible_module_compression to ZIP_DEFLATED 10363 1726773069.59297: Set connection var ansible_shell_type to sh 10363 1726773069.59302: Set connection var ansible_shell_executable to /bin/sh 10363 1726773069.59307: Set connection var ansible_connection to ssh 10363 1726773069.59321: variable 'ansible_shell_executable' from source: unknown 10363 1726773069.59325: variable 'ansible_connection' from source: unknown 
10363 1726773069.59328: variable 'ansible_module_compression' from source: unknown 10363 1726773069.59331: variable 'ansible_shell_type' from source: unknown 10363 1726773069.59335: variable 'ansible_shell_executable' from source: unknown 10363 1726773069.59338: variable 'ansible_host' from source: host vars for 'managed_node2' 10363 1726773069.59343: variable 'ansible_pipelining' from source: unknown 10363 1726773069.59346: variable 'ansible_timeout' from source: unknown 10363 1726773069.59350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10363 1726773069.59422: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10363 1726773069.59434: variable 'omit' from source: magic vars 10363 1726773069.59440: starting attempt loop 10363 1726773069.59443: running the handler 10363 1726773069.59488: handler run complete 10363 1726773069.59499: attempt loop complete, returning result 10363 1726773069.59503: _execute() done 10363 1726773069.59506: dumping result to json 10363 1726773069.59510: done dumping result, returning 10363 1726773069.59516: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-885f-bbcf-00000000041b] 10363 1726773069.59521: sending task result for task 0affffe7-6841-885f-bbcf-00000000041b 10363 1726773069.59545: done sending task result for task 0affffe7-6841-885f-bbcf-00000000041b 10363 1726773069.59548: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8240 1726773069.59760: no more pending results, returning what we have 8240 1726773069.59762: results queue empty 8240 1726773069.59763: checking for any_errors_fatal 8240 1726773069.59767: done checking for any_errors_fatal 8240 1726773069.59767: checking for max_fail_percentage 8240 1726773069.59768: done checking for max_fail_percentage 8240 1726773069.59769: checking to see if all hosts have failed and the running result is not ok 8240 1726773069.59769: done checking to see if all hosts have failed 8240 1726773069.59770: getting the remaining hosts for this loop 8240 1726773069.59771: done getting the remaining hosts for this loop 8240 1726773069.59774: getting the next task for host managed_node2 8240 1726773069.59779: done getting next task for host managed_node2 8240 1726773069.59781: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8240 1726773069.59783: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8240 1726773069.59793: getting variables 8240 1726773069.59794: in VariableManager get_vars() 8240 1726773069.59813: Calling all_inventory to load vars for managed_node2 8240 1726773069.59815: Calling groups_inventory to load vars for managed_node2 8240 1726773069.59816: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773069.59823: Calling all_plugins_play to load vars for managed_node2 8240 1726773069.59825: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773069.59826: Calling groups_plugins_play to load vars for managed_node2 8240 1726773069.59920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773069.60038: done with get_vars() 8240 1726773069.60045: done getting variables 8240 1726773069.60087: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:09 -0400 (0:00:00.027) 0:00:48.244 **** 8240 1726773069.60109: entering _queue_task() for managed_node2/package 8240 1726773069.60259: worker is 1 (out of 1 available) 8240 1726773069.60273: exiting _queue_task() for managed_node2/package 8240 1726773069.60287: done queuing things up, now waiting for results queue to drain 8240 1726773069.60289: waiting for pending results... 
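The include_vars task above ("Set platform/version specific variables", set_vars.yml:31) resolves a first_found lookup over several candidate vars files (the trace shows the lookup reading ansible_facts four times plus role_path) and ends up loading vars/default.yml, which defines __kernel_settings_packages and __kernel_settings_services. The candidate file names below follow the usual linux_system_roles convention and are an assumption; only default.yml and the ffparams/role_path wiring are confirmed by this run.

- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', ffparams) }}"
  vars:
    ffparams:
      files:
        # candidate names assumed; only default.yml is confirmed by this trace
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['os_family'] }}.yml"
        - default.yml
      paths:
        - "{{ role_path }}/vars"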
10364 1726773069.60408: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 10364 1726773069.60517: in run() - task 0affffe7-6841-885f-bbcf-000000000301 10364 1726773069.60532: variable 'ansible_search_path' from source: unknown 10364 1726773069.60536: variable 'ansible_search_path' from source: unknown 10364 1726773069.60561: calling self._execute() 10364 1726773069.60625: variable 'ansible_host' from source: host vars for 'managed_node2' 10364 1726773069.60633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10364 1726773069.60642: variable 'omit' from source: magic vars 10364 1726773069.60714: variable 'omit' from source: magic vars 10364 1726773069.60748: variable 'omit' from source: magic vars 10364 1726773069.60768: variable '__kernel_settings_packages' from source: include_vars 10364 1726773069.60982: variable '__kernel_settings_packages' from source: include_vars 10364 1726773069.61134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10364 1726773069.62621: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10364 1726773069.62666: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10364 1726773069.62702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10364 1726773069.62728: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10364 1726773069.62748: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10364 1726773069.62984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10364 1726773069.63009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10364 1726773069.63029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10364 1726773069.63055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10364 1726773069.63067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10364 1726773069.63143: variable '__kernel_settings_is_ostree' from source: set_fact 10364 1726773069.63150: variable 'omit' from source: magic vars 10364 1726773069.63175: variable 'omit' from source: magic vars 10364 1726773069.63198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10364 1726773069.63220: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10364 1726773069.63235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10364 1726773069.63246: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10364 1726773069.63254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10364 1726773069.63277: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10364 1726773069.63281: variable 'ansible_host' from source: host vars for 'managed_node2' 10364 1726773069.63284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10364 1726773069.63365: Set connection var ansible_pipelining to False 10364 1726773069.63375: Set connection var ansible_timeout to 10 10364 1726773069.63383: Set connection var ansible_module_compression to ZIP_DEFLATED 10364 1726773069.63388: Set connection var ansible_shell_type to sh 10364 1726773069.63393: Set connection var ansible_shell_executable to /bin/sh 10364 1726773069.63398: Set connection var ansible_connection to ssh 10364 1726773069.63415: variable 'ansible_shell_executable' from source: unknown 10364 1726773069.63419: variable 'ansible_connection' from source: unknown 10364 1726773069.63422: variable 'ansible_module_compression' from source: unknown 10364 1726773069.63425: variable 'ansible_shell_type' from source: unknown 10364 1726773069.63428: variable 'ansible_shell_executable' from source: unknown 10364 1726773069.63431: variable 'ansible_host' from source: host vars for 'managed_node2' 10364 1726773069.63435: variable 'ansible_pipelining' from source: unknown 10364 1726773069.63437: variable 'ansible_timeout' from source: unknown 10364 1726773069.63440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10364 1726773069.63501: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10364 1726773069.63511: variable 'omit' from source: magic vars 10364 1726773069.63516: starting attempt loop 10364 1726773069.63518: running the handler 10364 1726773069.63578: variable 'ansible_facts' from source: unknown 10364 1726773069.63660: _low_level_execute_command(): starting 10364 1726773069.63669: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10364 1726773069.65993: stdout chunk (state=2): >>>/root <<< 10364 1726773069.66112: stderr chunk (state=3): >>><<< 10364 1726773069.66121: stdout chunk (state=3): >>><<< 10364 1726773069.66141: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10364 1726773069.66153: _low_level_execute_command(): starting 10364 1726773069.66159: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569 `" && echo ansible-tmp-1726773069.6614878-10364-4502239412569="` echo /root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569 `" ) && sleep 0' 10364 1726773069.68694: stdout chunk (state=2): >>>ansible-tmp-1726773069.6614878-10364-4502239412569=/root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569 <<< 10364 1726773069.68832: stderr chunk (state=3): >>><<< 10364 1726773069.68839: stdout chunk (state=3): >>><<< 10364 1726773069.68857: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773069.6614878-10364-4502239412569=/root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569 , stderr= 10364 1726773069.68887: variable 'ansible_module_compression' from source: unknown 10364 1726773069.68933: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10364 1726773069.68972: variable 'ansible_facts' from source: unknown 10364 1726773069.69061: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569/AnsiballZ_dnf.py 10364 1726773069.69165: Sending initial data 10364 1726773069.69175: Sent initial data (149 bytes) 10364 1726773069.71702: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpufki81b5 /root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569/AnsiballZ_dnf.py <<< 10364 1726773069.73107: stderr chunk (state=3): >>><<< 10364 1726773069.73118: stdout chunk (state=3): >>><<< 10364 1726773069.73139: done transferring module to remote 10364 1726773069.73151: _low_level_execute_command(): starting 10364 1726773069.73156: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569/ /root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569/AnsiballZ_dnf.py && sleep 0' 10364 1726773069.75555: stderr chunk (state=2): >>><<< 10364 1726773069.75566: stdout chunk (state=2): >>><<< 10364 1726773069.75583: _low_level_execute_command() done: rc=0, stdout=, stderr= 10364 1726773069.75590: _low_level_execute_command(): starting 10364 1726773069.75595: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569/AnsiballZ_dnf.py && sleep 0' 10364 1726773074.54951: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10364 1726773074.62712: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 10364 1726773074.62761: stderr chunk (state=3): >>><<< 10364 1726773074.62768: stdout chunk (state=3): >>><<< 10364 1726773074.62787: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.9.64 closed. 10364 1726773074.62824: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10364 1726773074.62832: _low_level_execute_command(): starting 10364 1726773074.62839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773069.6614878-10364-4502239412569/ > /dev/null 2>&1 && sleep 0' 10364 1726773074.65337: stderr chunk (state=2): >>><<< 10364 1726773074.65347: stdout chunk (state=2): >>><<< 10364 1726773074.65363: _low_level_execute_command() done: rc=0, stdout=, stderr= 10364 1726773074.65374: handler run complete 10364 1726773074.65403: attempt loop complete, returning result 10364 1726773074.65407: _execute() done 10364 1726773074.65410: dumping result to json 10364 1726773074.65417: done dumping result, returning 10364 1726773074.65424: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-885f-bbcf-000000000301] 10364 1726773074.65430: sending task result for task 0affffe7-6841-885f-bbcf-000000000301 10364 1726773074.65461: done sending task result for task 0affffe7-6841-885f-bbcf-000000000301 10364 1726773074.65465: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8240 1726773074.65635: no more pending results, returning what we have 8240 1726773074.65639: results queue empty 8240 1726773074.65639: checking for any_errors_fatal 8240 1726773074.65646: done checking for any_errors_fatal 8240 1726773074.65646: checking for max_fail_percentage 8240 1726773074.65648: done checking for max_fail_percentage 8240 1726773074.65648: checking to see if all hosts have failed and the running result is not ok 8240 1726773074.65649: done checking to see if all hosts have failed 8240 1726773074.65650: getting the remaining hosts for this loop 8240 
1726773074.65651: done getting the remaining hosts for this loop 8240 1726773074.65654: getting the next task for host managed_node2 8240 1726773074.65662: done getting next task for host managed_node2 8240 1726773074.65665: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8240 1726773074.65668: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773074.65678: getting variables 8240 1726773074.65679: in VariableManager get_vars() 8240 1726773074.65715: Calling all_inventory to load vars for managed_node2 8240 1726773074.65718: Calling groups_inventory to load vars for managed_node2 8240 1726773074.65719: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773074.65729: Calling all_plugins_play to load vars for managed_node2 8240 1726773074.65732: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773074.65734: Calling groups_plugins_play to load vars for managed_node2 8240 1726773074.65845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773074.65995: done with get_vars() 8240 1726773074.66003: done getting variables 8240 1726773074.66045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:14 -0400 (0:00:05.059) 0:00:53.304 **** 8240 1726773074.66069: entering _queue_task() for managed_node2/debug 8240 1726773074.66234: worker is 1 (out of 1 available) 8240 1726773074.66247: exiting _queue_task() for managed_node2/debug 8240 1726773074.66260: done queuing things up, now waiting for results queue to drain 8240 1726773074.66262: waiting for pending results... 
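The five-second task above ("Ensure required packages are installed", main.yml:12, 0:00:05.059) is a plain package install: the action plugin resolved to ansible.legacy.dnf, shipped AnsiballZ_dnf.py over SFTP, and dnf answered "Nothing to do" because tuned and python3-configobj were already present, so the task reports ok rather than changed. The trace also shows __kernel_settings_is_ostree being read, which on rpm-ostree systems would steer the install to a different backend; that wiring is not visible here, so the sketch keeps it as a comment.

- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"   # ["tuned", "python3-configobj"] from vars/default.yml
    state: present
  # the real task also consults __kernel_settings_is_ostree, presumably to
  # select the package backend on rpm-ostree hosts; the exact expression is
  # not shown in this trace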
10449 1726773074.66389: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 10449 1726773074.66499: in run() - task 0affffe7-6841-885f-bbcf-000000000303 10449 1726773074.66515: variable 'ansible_search_path' from source: unknown 10449 1726773074.66519: variable 'ansible_search_path' from source: unknown 10449 1726773074.66546: calling self._execute() 10449 1726773074.66616: variable 'ansible_host' from source: host vars for 'managed_node2' 10449 1726773074.66625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10449 1726773074.66633: variable 'omit' from source: magic vars 10449 1726773074.66977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10449 1726773074.68488: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10449 1726773074.68719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10449 1726773074.68748: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10449 1726773074.68775: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10449 1726773074.68798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10449 1726773074.68850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10449 1726773074.68873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10449 1726773074.68895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10449 1726773074.68922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10449 1726773074.68934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10449 1726773074.69007: variable '__kernel_settings_is_transactional' from source: set_fact 10449 1726773074.69023: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10449 1726773074.69027: when evaluation is False, skipping this task 10449 1726773074.69031: _execute() done 10449 1726773074.69035: dumping result to json 10449 1726773074.69038: done dumping result, returning 10449 1726773074.69044: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-000000000303] 10449 1726773074.69050: sending task result for task 0affffe7-6841-885f-bbcf-000000000303 10449 1726773074.69072: done sending task result for task 0affffe7-6841-885f-bbcf-000000000303 10449 1726773074.69075: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | 
d(false)" } 8240 1726773074.69181: no more pending results, returning what we have 8240 1726773074.69184: results queue empty 8240 1726773074.69187: checking for any_errors_fatal 8240 1726773074.69194: done checking for any_errors_fatal 8240 1726773074.69195: checking for max_fail_percentage 8240 1726773074.69196: done checking for max_fail_percentage 8240 1726773074.69197: checking to see if all hosts have failed and the running result is not ok 8240 1726773074.69198: done checking to see if all hosts have failed 8240 1726773074.69198: getting the remaining hosts for this loop 8240 1726773074.69199: done getting the remaining hosts for this loop 8240 1726773074.69208: getting the next task for host managed_node2 8240 1726773074.69215: done getting next task for host managed_node2 8240 1726773074.69218: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8240 1726773074.69220: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773074.69235: getting variables 8240 1726773074.69237: in VariableManager get_vars() 8240 1726773074.69269: Calling all_inventory to load vars for managed_node2 8240 1726773074.69274: Calling groups_inventory to load vars for managed_node2 8240 1726773074.69276: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773074.69288: Calling all_plugins_play to load vars for managed_node2 8240 1726773074.69291: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773074.69293: Calling groups_plugins_play to load vars for managed_node2 8240 1726773074.69403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773074.69522: done with get_vars() 8240 1726773074.69530: done getting variables 8240 1726773074.69573: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:14 -0400 (0:00:00.035) 0:00:53.339 **** 8240 1726773074.69598: entering _queue_task() for managed_node2/reboot 8240 1726773074.69756: worker is 1 (out of 1 available) 8240 1726773074.69768: exiting _queue_task() for managed_node2/reboot 8240 1726773074.69784: done queuing things up, now waiting for results queue to drain 8240 1726773074.69787: waiting for pending results... 
10450 1726773074.69900: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 10450 1726773074.70007: in run() - task 0affffe7-6841-885f-bbcf-000000000304 10450 1726773074.70023: variable 'ansible_search_path' from source: unknown 10450 1726773074.70027: variable 'ansible_search_path' from source: unknown 10450 1726773074.70053: calling self._execute() 10450 1726773074.70116: variable 'ansible_host' from source: host vars for 'managed_node2' 10450 1726773074.70125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10450 1726773074.70134: variable 'omit' from source: magic vars 10450 1726773074.70457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10450 1726773074.72127: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10450 1726773074.72174: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10450 1726773074.72204: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10450 1726773074.72231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10450 1726773074.72250: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10450 1726773074.72307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10450 1726773074.72328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10450 1726773074.72347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10450 1726773074.72375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10450 1726773074.72390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10450 1726773074.72462: variable '__kernel_settings_is_transactional' from source: set_fact 10450 1726773074.72478: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10450 1726773074.72483: when evaluation is False, skipping this task 10450 1726773074.72489: _execute() done 10450 1726773074.72493: dumping result to json 10450 1726773074.72497: done dumping result, returning 10450 1726773074.72502: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-885f-bbcf-000000000304] 10450 1726773074.72506: sending task result for task 0affffe7-6841-885f-bbcf-000000000304 10450 1726773074.72526: done sending task result for task 0affffe7-6841-885f-bbcf-000000000304 10450 1726773074.72528: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", 
"skip_reason": "Conditional result was False" } 8240 1726773074.72689: no more pending results, returning what we have 8240 1726773074.72692: results queue empty 8240 1726773074.72693: checking for any_errors_fatal 8240 1726773074.72699: done checking for any_errors_fatal 8240 1726773074.72700: checking for max_fail_percentage 8240 1726773074.72701: done checking for max_fail_percentage 8240 1726773074.72702: checking to see if all hosts have failed and the running result is not ok 8240 1726773074.72703: done checking to see if all hosts have failed 8240 1726773074.72703: getting the remaining hosts for this loop 8240 1726773074.72704: done getting the remaining hosts for this loop 8240 1726773074.72708: getting the next task for host managed_node2 8240 1726773074.72715: done getting next task for host managed_node2 8240 1726773074.72718: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8240 1726773074.72720: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773074.72736: getting variables 8240 1726773074.72738: in VariableManager get_vars() 8240 1726773074.72775: Calling all_inventory to load vars for managed_node2 8240 1726773074.72778: Calling groups_inventory to load vars for managed_node2 8240 1726773074.72780: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773074.72791: Calling all_plugins_play to load vars for managed_node2 8240 1726773074.72794: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773074.72797: Calling groups_plugins_play to load vars for managed_node2 8240 1726773074.72904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773074.73058: done with get_vars() 8240 1726773074.73065: done getting variables 8240 1726773074.73112: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:14 -0400 (0:00:00.035) 0:00:53.375 **** 8240 1726773074.73137: entering _queue_task() for managed_node2/fail 8240 1726773074.73302: worker is 1 (out of 1 available) 8240 1726773074.73316: exiting _queue_task() for managed_node2/fail 8240 1726773074.73329: done queuing things up, now waiting for results queue to drain 8240 1726773074.73331: waiting for pending results... 
10451 1726773074.73447: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 10451 1726773074.73556: in run() - task 0affffe7-6841-885f-bbcf-000000000305 10451 1726773074.73572: variable 'ansible_search_path' from source: unknown 10451 1726773074.73577: variable 'ansible_search_path' from source: unknown 10451 1726773074.73604: calling self._execute() 10451 1726773074.73666: variable 'ansible_host' from source: host vars for 'managed_node2' 10451 1726773074.73675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10451 1726773074.73684: variable 'omit' from source: magic vars 10451 1726773074.74011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10451 1726773074.75652: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10451 1726773074.75700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10451 1726773074.75728: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10451 1726773074.75764: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10451 1726773074.75787: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10451 1726773074.75841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10451 1726773074.75862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10451 1726773074.75881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10451 1726773074.75910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10451 1726773074.75922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10451 1726773074.75994: variable '__kernel_settings_is_transactional' from source: set_fact 10451 1726773074.76009: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10451 1726773074.76012: when evaluation is False, skipping this task 10451 1726773074.76014: _execute() done 10451 1726773074.76016: dumping result to json 10451 1726773074.76018: done dumping result, returning 10451 1726773074.76022: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-885f-bbcf-000000000305] 10451 1726773074.76026: sending task result for task 0affffe7-6841-885f-bbcf-000000000305 10451 1726773074.76045: done sending task result for task 0affffe7-6841-885f-bbcf-000000000305 skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" 
} 8240 1726773074.76191: no more pending results, returning what we have 8240 1726773074.76195: results queue empty 8240 1726773074.76195: checking for any_errors_fatal 8240 1726773074.76201: done checking for any_errors_fatal 8240 1726773074.76202: checking for max_fail_percentage 8240 1726773074.76203: done checking for max_fail_percentage 8240 1726773074.76204: checking to see if all hosts have failed and the running result is not ok 8240 1726773074.76205: done checking to see if all hosts have failed 8240 1726773074.76205: getting the remaining hosts for this loop 8240 1726773074.76207: done getting the remaining hosts for this loop 8240 1726773074.76210: getting the next task for host managed_node2 8240 1726773074.76221: done getting next task for host managed_node2 8240 1726773074.76224: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8240 1726773074.76226: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773074.76241: getting variables 8240 1726773074.76243: in VariableManager get_vars() 8240 1726773074.76278: Calling all_inventory to load vars for managed_node2 8240 1726773074.76280: Calling groups_inventory to load vars for managed_node2 8240 1726773074.76282: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773074.76293: Calling all_plugins_play to load vars for managed_node2 8240 1726773074.76296: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773074.76299: Calling groups_plugins_play to load vars for managed_node2 10451 1726773074.76047: WORKER PROCESS EXITING 8240 1726773074.76420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773074.76542: done with get_vars() 8240 1726773074.76549: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:14 -0400 (0:00:00.034) 0:00:53.409 **** 8240 1726773074.76613: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773074.76773: worker is 1 (out of 1 available) 8240 1726773074.76789: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773074.76802: done queuing things up, now waiting for results queue to drain 8240 1726773074.76804: waiting for pending results... 
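The three skips above ("Notify user that reboot is needed to apply changes", "Reboot transactional update systems", "Fail if reboot is needed and not set"; main.yml:24, :29 and :34) all short-circuit on the same false condition, __kernel_settings_is_transactional | d(false): this host is not a transactional-update system, so none of the reboot handling applies. When a multi-clause when short-circuits, only the failing clause is reported as false_condition, so the real tasks may carry further conditions (for example a user-settable "reboot is allowed" flag) that were never evaluated here. A condensed sketch of the pattern; the task names and the shared condition come from the trace, the msg texts are placeholders.

- name: Notify user that reboot is needed to apply changes
  debug:
    msg: Reboot required to apply changes   # placeholder text
  when: __kernel_settings_is_transactional | d(false)

- name: Reboot transactional update systems
  reboot:
  when: __kernel_settings_is_transactional | d(false)
  # presumably also gated on a variable allowing the reboot

- name: Fail if reboot is needed and not set
  fail:
    msg: Reboot is required but has not been allowed   # placeholder text
  when: __kernel_settings_is_transactional | d(false)
  # likewise presumably carries additional conditions not visible in this run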
10452 1726773074.76917: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 10452 1726773074.77019: in run() - task 0affffe7-6841-885f-bbcf-000000000307 10452 1726773074.77036: variable 'ansible_search_path' from source: unknown 10452 1726773074.77040: variable 'ansible_search_path' from source: unknown 10452 1726773074.77066: calling self._execute() 10452 1726773074.77131: variable 'ansible_host' from source: host vars for 'managed_node2' 10452 1726773074.77140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10452 1726773074.77149: variable 'omit' from source: magic vars 10452 1726773074.77220: variable 'omit' from source: magic vars 10452 1726773074.77255: variable 'omit' from source: magic vars 10452 1726773074.77277: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10452 1726773074.77481: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10452 1726773074.77537: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10452 1726773074.77566: variable 'omit' from source: magic vars 10452 1726773074.77598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10452 1726773074.77620: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10452 1726773074.77635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10452 1726773074.77694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10452 1726773074.77704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10452 1726773074.77724: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10452 1726773074.77728: variable 'ansible_host' from source: host vars for 'managed_node2' 10452 1726773074.77730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10452 1726773074.77810: Set connection var ansible_pipelining to False 10452 1726773074.77818: Set connection var ansible_timeout to 10 10452 1726773074.77825: Set connection var ansible_module_compression to ZIP_DEFLATED 10452 1726773074.77828: Set connection var ansible_shell_type to sh 10452 1726773074.77833: Set connection var ansible_shell_executable to /bin/sh 10452 1726773074.77838: Set connection var ansible_connection to ssh 10452 1726773074.77854: variable 'ansible_shell_executable' from source: unknown 10452 1726773074.77857: variable 'ansible_connection' from source: unknown 10452 1726773074.77861: variable 'ansible_module_compression' from source: unknown 10452 1726773074.77864: variable 'ansible_shell_type' from source: unknown 10452 1726773074.77868: variable 'ansible_shell_executable' from source: unknown 10452 1726773074.77874: variable 'ansible_host' from source: host vars for 'managed_node2' 10452 1726773074.77878: variable 'ansible_pipelining' from source: unknown 10452 1726773074.77881: variable 'ansible_timeout' from source: unknown 10452 1726773074.77887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10452 1726773074.78008: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10452 1726773074.78019: variable 'omit' from source: magic vars 10452 1726773074.78024: starting attempt loop 10452 1726773074.78028: running the handler 10452 1726773074.78039: _low_level_execute_command(): starting 10452 1726773074.78046: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10452 1726773074.80402: stdout chunk (state=2): >>>/root <<< 10452 1726773074.80527: stderr chunk (state=3): >>><<< 10452 1726773074.80536: stdout chunk (state=3): >>><<< 10452 1726773074.80556: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10452 1726773074.80570: _low_level_execute_command(): starting 10452 1726773074.80578: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530 `" && echo ansible-tmp-1726773074.805638-10452-195230133583530="` echo /root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530 `" ) && sleep 0' 10452 1726773074.83200: stdout chunk (state=2): >>>ansible-tmp-1726773074.805638-10452-195230133583530=/root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530 <<< 10452 1726773074.83411: stderr chunk (state=3): >>><<< 10452 1726773074.83418: stdout chunk (state=3): >>><<< 10452 1726773074.83435: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773074.805638-10452-195230133583530=/root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530 , stderr= 10452 1726773074.83472: variable 'ansible_module_compression' from source: unknown 10452 1726773074.83505: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10452 1726773074.83536: variable 'ansible_facts' from source: unknown 10452 1726773074.83602: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530/AnsiballZ_kernel_settings_get_config.py 10452 1726773074.83699: Sending initial data 10452 1726773074.83707: Sent initial data (173 bytes) 10452 1726773074.86287: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp6sfgj60c /root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530/AnsiballZ_kernel_settings_get_config.py <<< 10452 1726773074.87353: stderr chunk (state=3): >>><<< 10452 1726773074.87360: stdout chunk (state=3): >>><<< 10452 1726773074.87383: done transferring module to remote 10452 1726773074.87396: _low_level_execute_command(): starting 10452 1726773074.87401: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530/ /root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10452 1726773074.89766: stderr chunk (state=2): >>><<< 10452 1726773074.89775: stdout chunk (state=2): >>><<< 10452 1726773074.89790: _low_level_execute_command() done: rc=0, stdout=, stderr= 10452 1726773074.89794: _low_level_execute_command(): starting 10452 1726773074.89799: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10452 1726773075.05651: stdout chunk 
(state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 10452 1726773075.06742: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10452 1726773075.06794: stderr chunk (state=3): >>><<< 10452 1726773075.06802: stdout chunk (state=3): >>><<< 10452 1726773075.06819: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 10452 1726773075.06849: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10452 1726773075.06860: _low_level_execute_command(): starting 10452 1726773075.06866: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773074.805638-10452-195230133583530/ > /dev/null 2>&1 && sleep 0' 10452 1726773075.09323: stderr chunk (state=2): >>><<< 10452 1726773075.09333: stdout chunk (state=2): >>><<< 10452 1726773075.09348: _low_level_execute_command() done: rc=0, stdout=, stderr= 10452 1726773075.09356: handler run complete 10452 1726773075.09374: attempt loop complete, returning result 10452 1726773075.09379: _execute() done 10452 1726773075.09382: dumping result to json 10452 1726773075.09388: done dumping result, returning 10452 1726773075.09396: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-885f-bbcf-000000000307] 10452 1726773075.09402: sending task result for task 0affffe7-6841-885f-bbcf-000000000307 10452 1726773075.09431: done sending task result for task 0affffe7-6841-885f-bbcf-000000000307 10452 1726773075.09435: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8240 1726773075.09587: no more pending results, returning what we have 8240 1726773075.09592: results queue empty 8240 1726773075.09593: checking for any_errors_fatal 8240 1726773075.09598: done checking for any_errors_fatal 8240 1726773075.09599: checking for max_fail_percentage 8240 1726773075.09600: done 
checking for max_fail_percentage 8240 1726773075.09601: checking to see if all hosts have failed and the running result is not ok 8240 1726773075.09602: done checking to see if all hosts have failed 8240 1726773075.09603: getting the remaining hosts for this loop 8240 1726773075.09604: done getting the remaining hosts for this loop 8240 1726773075.09607: getting the next task for host managed_node2 8240 1726773075.09613: done getting next task for host managed_node2 8240 1726773075.09616: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8240 1726773075.09619: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773075.09629: getting variables 8240 1726773075.09630: in VariableManager get_vars() 8240 1726773075.09663: Calling all_inventory to load vars for managed_node2 8240 1726773075.09666: Calling groups_inventory to load vars for managed_node2 8240 1726773075.09668: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773075.09680: Calling all_plugins_play to load vars for managed_node2 8240 1726773075.09683: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773075.09686: Calling groups_plugins_play to load vars for managed_node2 8240 1726773075.09830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773075.09952: done with get_vars() 8240 1726773075.09960: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.334) 0:00:53.744 **** 8240 1726773075.10032: entering _queue_task() for managed_node2/stat 8240 1726773075.10201: worker is 1 (out of 1 available) 8240 1726773075.10212: exiting _queue_task() for managed_node2/stat 8240 1726773075.10225: done queuing things up, now waiting for results queue to drain 8240 1726773075.10226: waiting for pending results... 
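The "Read tuned main config" task above (main.yml:42, 0:00:00.334) runs the role's own module, fedora.linux_system_roles.kernel_settings_get_config, against /etc/tuned/tuned-main.conf and gets back the parsed key/value pairs under data (daemon=1, dynamic_tuning=0, and so on). The next task reads the result as __kernel_settings_register_tuned_main, so the register name below is inferred from that; the path variable comes straight from the trace.

- name: Read tuned main config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: "{{ __kernel_settings_tuned_main_conf_file }}"   # /etc/tuned/tuned-main.conf on this host
  register: __kernel_settings_register_tuned_main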
10463 1726773075.10346: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 10463 1726773075.10458: in run() - task 0affffe7-6841-885f-bbcf-000000000308 10463 1726773075.10476: variable 'ansible_search_path' from source: unknown 10463 1726773075.10480: variable 'ansible_search_path' from source: unknown 10463 1726773075.10518: variable '__prof_from_conf' from source: task vars 10463 1726773075.10749: variable '__prof_from_conf' from source: task vars 10463 1726773075.10884: variable '__data' from source: task vars 10463 1726773075.10937: variable '__kernel_settings_register_tuned_main' from source: set_fact 10463 1726773075.11074: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10463 1726773075.11086: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10463 1726773075.11128: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10463 1726773075.11145: variable 'omit' from source: magic vars 10463 1726773075.11230: variable 'ansible_host' from source: host vars for 'managed_node2' 10463 1726773075.11240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10463 1726773075.11248: variable 'omit' from source: magic vars 10463 1726773075.11422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10463 1726773075.13117: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10463 1726773075.13173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10463 1726773075.13203: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10463 1726773075.13230: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10463 1726773075.13250: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10463 1726773075.13307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10463 1726773075.13328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10463 1726773075.13347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10463 1726773075.13377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10463 1726773075.13391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10463 1726773075.13456: variable 'item' from source: unknown 10463 1726773075.13470: Evaluated conditional (item | length > 0): False 10463 1726773075.13477: when evaluation is False, skipping this task 10463 1726773075.13502: variable 'item' from source: unknown 10463 1726773075.13550: variable 'item' from source: unknown skipping: 
[managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 10463 1726773075.13628: variable 'ansible_host' from source: host vars for 'managed_node2' 10463 1726773075.13638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10463 1726773075.13647: variable 'omit' from source: magic vars 10463 1726773075.13769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10463 1726773075.13792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10463 1726773075.13810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10463 1726773075.13837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10463 1726773075.13850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10463 1726773075.13907: variable 'item' from source: unknown 10463 1726773075.13916: Evaluated conditional (item | length > 0): True 10463 1726773075.13922: variable 'omit' from source: magic vars 10463 1726773075.13952: variable 'omit' from source: magic vars 10463 1726773075.13986: variable 'item' from source: unknown 10463 1726773075.14030: variable 'item' from source: unknown 10463 1726773075.14043: variable 'omit' from source: magic vars 10463 1726773075.14064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10463 1726773075.14088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10463 1726773075.14104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10463 1726773075.14118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10463 1726773075.14127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10463 1726773075.14150: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10463 1726773075.14155: variable 'ansible_host' from source: host vars for 'managed_node2' 10463 1726773075.14159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10463 1726773075.14228: Set connection var ansible_pipelining to False 10463 1726773075.14235: Set connection var ansible_timeout to 10 10463 1726773075.14243: Set connection var ansible_module_compression to ZIP_DEFLATED 10463 1726773075.14246: Set connection var ansible_shell_type to sh 10463 1726773075.14251: Set connection var ansible_shell_executable to /bin/sh 10463 1726773075.14256: Set connection var ansible_connection to ssh 10463 1726773075.14273: variable 'ansible_shell_executable' from source: unknown 10463 1726773075.14278: variable 'ansible_connection' 
from source: unknown 10463 1726773075.14281: variable 'ansible_module_compression' from source: unknown 10463 1726773075.14284: variable 'ansible_shell_type' from source: unknown 10463 1726773075.14289: variable 'ansible_shell_executable' from source: unknown 10463 1726773075.14293: variable 'ansible_host' from source: host vars for 'managed_node2' 10463 1726773075.14297: variable 'ansible_pipelining' from source: unknown 10463 1726773075.14300: variable 'ansible_timeout' from source: unknown 10463 1726773075.14305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10463 1726773075.14415: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10463 1726773075.14425: variable 'omit' from source: magic vars 10463 1726773075.14431: starting attempt loop 10463 1726773075.14435: running the handler 10463 1726773075.14446: _low_level_execute_command(): starting 10463 1726773075.14452: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10463 1726773075.16792: stdout chunk (state=2): >>>/root <<< 10463 1726773075.16984: stderr chunk (state=3): >>><<< 10463 1726773075.16992: stdout chunk (state=3): >>><<< 10463 1726773075.17010: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10463 1726773075.17022: _low_level_execute_command(): starting 10463 1726773075.17028: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916 `" && echo ansible-tmp-1726773075.1701763-10463-216588393412916="` echo /root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916 `" ) && sleep 0' 10463 1726773075.19612: stdout chunk (state=2): >>>ansible-tmp-1726773075.1701763-10463-216588393412916=/root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916 <<< 10463 1726773075.19742: stderr chunk (state=3): >>><<< 10463 1726773075.19749: stdout chunk (state=3): >>><<< 10463 1726773075.19769: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773075.1701763-10463-216588393412916=/root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916 , stderr= 10463 1726773075.19813: variable 'ansible_module_compression' from source: unknown 10463 1726773075.19856: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10463 1726773075.19888: variable 'ansible_facts' from source: unknown 10463 1726773075.19952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916/AnsiballZ_stat.py 10463 1726773075.20056: Sending initial data 10463 1726773075.20063: Sent initial data (152 bytes) 10463 1726773075.22601: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpknhc_k6l /root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916/AnsiballZ_stat.py <<< 10463 1726773075.23697: stderr chunk (state=3): >>><<< 10463 1726773075.23705: stdout chunk (state=3): >>><<< 10463 1726773075.23723: done transferring module to remote 10463 1726773075.23734: _low_level_execute_command(): starting 10463 1726773075.23739: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916/ 
/root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916/AnsiballZ_stat.py && sleep 0' 10463 1726773075.26073: stderr chunk (state=2): >>><<< 10463 1726773075.26080: stdout chunk (state=2): >>><<< 10463 1726773075.26094: _low_level_execute_command() done: rc=0, stdout=, stderr= 10463 1726773075.26098: _low_level_execute_command(): starting 10463 1726773075.26103: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916/AnsiballZ_stat.py && sleep 0' 10463 1726773075.41099: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10463 1726773075.42251: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10463 1726773075.42261: stdout chunk (state=3): >>><<< 10463 1726773075.42270: stderr chunk (state=3): >>><<< 10463 1726773075.42282: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 10463 1726773075.42307: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10463 1726773075.42318: _low_level_execute_command(): starting 10463 1726773075.42322: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773075.1701763-10463-216588393412916/ > /dev/null 2>&1 && sleep 0' 10463 1726773075.45005: stderr chunk (state=2): >>><<< 10463 1726773075.45016: stdout chunk (state=2): >>><<< 10463 1726773075.45034: _low_level_execute_command() done: rc=0, stdout=, stderr= 10463 1726773075.45042: handler run complete 10463 1726773075.45063: attempt loop complete, returning result 10463 1726773075.45083: variable 'item' from source: unknown 10463 1726773075.45169: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 10463 1726773075.45268: variable 'ansible_host' from source: host vars for 'managed_node2' 10463 1726773075.45279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10463 1726773075.45291: variable 'omit' from source: magic vars 10463 1726773075.45444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10463 1726773075.45471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10463 1726773075.45498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10463 1726773075.45538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10463 1726773075.45555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10463 1726773075.45636: variable 'item' from source: unknown 10463 1726773075.45646: Evaluated conditional (item | length > 0): True 10463 1726773075.45651: variable 'omit' from source: magic vars 10463 1726773075.45667: variable 'omit' from source: magic vars 10463 1726773075.45710: variable 'item' from source: unknown 10463 1726773075.45772: variable 'item' from source: unknown 10463 1726773075.45790: variable 'omit' from source: magic vars 10463 1726773075.45813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10463 1726773075.45823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10463 1726773075.45830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10463 1726773075.45844: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10463 1726773075.45849: variable 'ansible_host' from source: host vars for 'managed_node2' 10463 1726773075.45853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10463 1726773075.45929: Set connection var ansible_pipelining to False 10463 1726773075.45938: Set connection var ansible_timeout to 10 10463 1726773075.45947: Set connection var ansible_module_compression to ZIP_DEFLATED 10463 1726773075.45950: Set connection var ansible_shell_type to sh 10463 1726773075.45956: Set connection var ansible_shell_executable to /bin/sh 10463 1726773075.45961: Set connection var ansible_connection to ssh 10463 1726773075.45978: variable 'ansible_shell_executable' from source: unknown 10463 1726773075.45983: variable 'ansible_connection' from source: unknown 10463 1726773075.45989: variable 'ansible_module_compression' from source: unknown 10463 1726773075.45992: variable 'ansible_shell_type' from source: unknown 10463 1726773075.45995: variable 'ansible_shell_executable' from source: unknown 10463 1726773075.45998: variable 'ansible_host' from source: host vars for 'managed_node2' 10463 1726773075.46001: variable 'ansible_pipelining' from source: unknown 10463 1726773075.46004: variable 'ansible_timeout' from source: unknown 10463 1726773075.46007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10463 1726773075.46253: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10463 1726773075.46263: variable 
'omit' from source: magic vars 10463 1726773075.46267: starting attempt loop 10463 1726773075.46270: running the handler 10463 1726773075.46276: _low_level_execute_command(): starting 10463 1726773075.46280: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10463 1726773075.48728: stdout chunk (state=2): >>>/root <<< 10463 1726773075.48873: stderr chunk (state=3): >>><<< 10463 1726773075.48881: stdout chunk (state=3): >>><<< 10463 1726773075.48900: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10463 1726773075.48912: _low_level_execute_command(): starting 10463 1726773075.48919: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270 `" && echo ansible-tmp-1726773075.4890883-10463-179295020512270="` echo /root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270 `" ) && sleep 0' 10463 1726773075.52008: stdout chunk (state=2): >>>ansible-tmp-1726773075.4890883-10463-179295020512270=/root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270 <<< 10463 1726773075.52132: stderr chunk (state=3): >>><<< 10463 1726773075.52138: stdout chunk (state=3): >>><<< 10463 1726773075.52151: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773075.4890883-10463-179295020512270=/root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270 , stderr= 10463 1726773075.52181: variable 'ansible_module_compression' from source: unknown 10463 1726773075.52231: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10463 1726773075.52250: variable 'ansible_facts' from source: unknown 10463 1726773075.52329: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270/AnsiballZ_stat.py 10463 1726773075.52605: Sending initial data 10463 1726773075.52612: Sent initial data (152 bytes) 10463 1726773075.55094: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpyy57uvvy /root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270/AnsiballZ_stat.py <<< 10463 1726773075.56656: stderr chunk (state=3): >>><<< 10463 1726773075.56663: stdout chunk (state=3): >>><<< 10463 1726773075.56679: done transferring module to remote 10463 1726773075.56689: _low_level_execute_command(): starting 10463 1726773075.56693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270/ /root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270/AnsiballZ_stat.py && sleep 0' 10463 1726773075.59078: stderr chunk (state=2): >>><<< 10463 1726773075.59091: stdout chunk (state=2): >>><<< 10463 1726773075.59109: _low_level_execute_command() done: rc=0, stdout=, stderr= 10463 1726773075.59113: _low_level_execute_command(): starting 10463 1726773075.59118: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270/AnsiballZ_stat.py && sleep 0' 10463 1726773075.76407: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773042.2211215, 
"mtime": 1726773040.2991023, "ctime": 1726773040.2991023, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10463 1726773075.77565: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10463 1726773075.77616: stderr chunk (state=3): >>><<< 10463 1726773075.77623: stdout chunk (state=3): >>><<< 10463 1726773075.77637: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773042.2211215, "mtime": 1726773040.2991023, "ctime": 1726773040.2991023, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 
10463 1726773075.77667: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10463 1726773075.77675: _low_level_execute_command(): starting 10463 1726773075.77679: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773075.4890883-10463-179295020512270/ > /dev/null 2>&1 && sleep 0' 10463 1726773075.80109: stderr chunk (state=2): >>><<< 10463 1726773075.80118: stdout chunk (state=2): >>><<< 10463 1726773075.80144: _low_level_execute_command() done: rc=0, stdout=, stderr= 10463 1726773075.80150: handler run complete 10463 1726773075.80178: attempt loop complete, returning result 10463 1726773075.80195: variable 'item' from source: unknown 10463 1726773075.80252: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773042.2211215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773040.2991023, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726773040.2991023, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10463 1726773075.80294: dumping result to json 10463 1726773075.80303: done dumping result, returning 10463 1726773075.80310: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-885f-bbcf-000000000308] 10463 1726773075.80314: sending task result for task 0affffe7-6841-885f-bbcf-000000000308 10463 1726773075.80342: done sending task result for task 0affffe7-6841-885f-bbcf-000000000308 10463 1726773075.80345: WORKER PROCESS EXITING 8240 1726773075.80633: no more pending results, returning what we have 8240 1726773075.80635: results queue empty 8240 1726773075.80636: checking for any_errors_fatal 8240 1726773075.80640: done checking for any_errors_fatal 8240 1726773075.80640: checking for max_fail_percentage 8240 1726773075.80641: done checking for max_fail_percentage 8240 1726773075.80642: checking to see if all hosts have failed and the running result is not ok 8240 1726773075.80643: done checking to see if all hosts have failed 8240 1726773075.80643: getting the remaining hosts for this loop 8240 1726773075.80644: done getting the remaining hosts for this loop 8240 1726773075.80646: getting the next task for host managed_node2 8240 1726773075.80651: done getting next task for host managed_node2 8240 
1726773075.80653: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8240 1726773075.80654: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773075.80661: getting variables 8240 1726773075.80662: in VariableManager get_vars() 8240 1726773075.80692: Calling all_inventory to load vars for managed_node2 8240 1726773075.80694: Calling groups_inventory to load vars for managed_node2 8240 1726773075.80696: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773075.80704: Calling all_plugins_play to load vars for managed_node2 8240 1726773075.80706: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773075.80707: Calling groups_plugins_play to load vars for managed_node2 8240 1726773075.80815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773075.80935: done with get_vars() 8240 1726773075.80942: done getting variables 8240 1726773075.80989: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.709) 0:00:54.453 **** 8240 1726773075.81010: entering _queue_task() for managed_node2/set_fact 8240 1726773075.81174: worker is 1 (out of 1 available) 8240 1726773075.81190: exiting _queue_task() for managed_node2/set_fact 8240 1726773075.81203: done queuing things up, now waiting for results queue to drain 8240 1726773075.81205: waiting for pending results... 
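The "Set tuned profile parent dir" task queued here (tasks/main.yml:63) reduces the stat results to a single fact. The selection expression below is one plausible way to produce the value seen in the result further down (__kernel_settings_profile_parent: /etc/tuned); the role's actual expression may differ:

  - name: Set tuned profile parent dir
    ansible.builtin.set_fact:
      __kernel_settings_profile_parent: "{{ __kernel_settings_find_profile_dirs.results
        | selectattr('stat', 'defined')
        | selectattr('stat.exists')
        | map(attribute='item')
        | first }}"

With the results above, only the /etc/tuned item has stat.exists true (the empty item was skipped and /etc/tuned/profiles does not exist), so the fact resolves to /etc/tuned.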
10490 1726773075.81321: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 10490 1726773075.81435: in run() - task 0affffe7-6841-885f-bbcf-000000000309 10490 1726773075.81451: variable 'ansible_search_path' from source: unknown 10490 1726773075.81455: variable 'ansible_search_path' from source: unknown 10490 1726773075.81482: calling self._execute() 10490 1726773075.81548: variable 'ansible_host' from source: host vars for 'managed_node2' 10490 1726773075.81556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10490 1726773075.81565: variable 'omit' from source: magic vars 10490 1726773075.81868: variable 'omit' from source: magic vars 10490 1726773075.81905: variable 'omit' from source: magic vars 10490 1726773075.82289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10490 1726773075.84314: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10490 1726773075.84397: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10490 1726773075.84434: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10490 1726773075.84468: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10490 1726773075.84495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10490 1726773075.84569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10490 1726773075.84601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10490 1726773075.84627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10490 1726773075.84666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10490 1726773075.84681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10490 1726773075.84728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10490 1726773075.84753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10490 1726773075.84778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10490 1726773075.84819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10490 1726773075.84834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10490 1726773075.84890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10490 1726773075.84914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10490 1726773075.84939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10490 1726773075.84976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10490 1726773075.84993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10490 1726773075.85221: variable '__kernel_settings_find_profile_dirs' from source: set_fact 10490 1726773075.85312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10490 1726773075.85463: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10490 1726773075.85544: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10490 1726773075.85574: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10490 1726773075.85604: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10490 1726773075.85643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10490 1726773075.85664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10490 1726773075.85691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10490 1726773075.85717: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10490 1726773075.85767: variable 'omit' from source: magic vars 10490 1726773075.85795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10490 1726773075.85821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10490 1726773075.85839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10490 1726773075.85857: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10490 1726773075.85867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10490 1726773075.85898: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10490 1726773075.85904: variable 'ansible_host' from source: host vars for 'managed_node2' 10490 1726773075.85908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10490 1726773075.86005: Set connection var ansible_pipelining to False 10490 1726773075.86014: Set connection var ansible_timeout to 10 10490 1726773075.86023: Set connection var ansible_module_compression to ZIP_DEFLATED 10490 1726773075.86027: Set connection var ansible_shell_type to sh 10490 1726773075.86032: Set connection var ansible_shell_executable to /bin/sh 10490 1726773075.86037: Set connection var ansible_connection to ssh 10490 1726773075.86058: variable 'ansible_shell_executable' from source: unknown 10490 1726773075.86063: variable 'ansible_connection' from source: unknown 10490 1726773075.86066: variable 'ansible_module_compression' from source: unknown 10490 1726773075.86069: variable 'ansible_shell_type' from source: unknown 10490 1726773075.86072: variable 'ansible_shell_executable' from source: unknown 10490 1726773075.86074: variable 'ansible_host' from source: host vars for 'managed_node2' 10490 1726773075.86078: variable 'ansible_pipelining' from source: unknown 10490 1726773075.86081: variable 'ansible_timeout' from source: unknown 10490 1726773075.86087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10490 1726773075.86171: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10490 1726773075.86189: variable 'omit' from source: magic vars 10490 1726773075.86196: starting attempt loop 10490 1726773075.86200: running the handler 10490 1726773075.86211: handler run complete 10490 1726773075.86221: attempt loop complete, returning result 10490 1726773075.86224: _execute() done 10490 1726773075.86227: dumping result to json 10490 1726773075.86230: done dumping result, returning 10490 1726773075.86237: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-885f-bbcf-000000000309] 10490 1726773075.86243: sending task result for task 0affffe7-6841-885f-bbcf-000000000309 10490 1726773075.86267: done sending task result for task 0affffe7-6841-885f-bbcf-000000000309 10490 1726773075.86270: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8240 1726773075.86689: no more pending results, returning what we have 8240 1726773075.86692: results queue empty 8240 1726773075.86693: checking for any_errors_fatal 8240 1726773075.86701: done checking for any_errors_fatal 8240 1726773075.86702: checking for max_fail_percentage 8240 1726773075.86703: done checking for max_fail_percentage 8240 1726773075.86703: checking to see if all hosts have failed and the running result is not ok 8240 1726773075.86704: done checking to see if all hosts have failed 8240 1726773075.86705: getting the 
remaining hosts for this loop 8240 1726773075.86706: done getting the remaining hosts for this loop 8240 1726773075.86709: getting the next task for host managed_node2 8240 1726773075.86715: done getting next task for host managed_node2 8240 1726773075.86719: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8240 1726773075.86721: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773075.86731: getting variables 8240 1726773075.86733: in VariableManager get_vars() 8240 1726773075.86766: Calling all_inventory to load vars for managed_node2 8240 1726773075.86768: Calling groups_inventory to load vars for managed_node2 8240 1726773075.86770: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773075.86779: Calling all_plugins_play to load vars for managed_node2 8240 1726773075.86782: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773075.86786: Calling groups_plugins_play to load vars for managed_node2 8240 1726773075.87162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773075.87347: done with get_vars() 8240 1726773075.87357: done getting variables 8240 1726773075.87417: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:15 -0400 (0:00:00.064) 0:00:54.518 **** 8240 1726773075.87448: entering _queue_task() for managed_node2/service 8240 1726773075.87646: worker is 1 (out of 1 available) 8240 1726773075.87658: exiting _queue_task() for managed_node2/service 8240 1726773075.87672: done queuing things up, now waiting for results queue to drain 8240 1726773075.87673: waiting for pending results... 
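The "Ensure required services are enabled and started" task queued here (tasks/main.yml:67) loops over the __kernel_settings_services list loaded via include_vars; in this run the only item is tuned, and the module arguments visible in the result below are name=tuned, state=started, enabled=true. A reconstruction of that task, not the role's literal source, might look like this (on this host the service module resolves to ansible.legacy.systemd, as the log shows):

  - name: Ensure required services are enabled and started
    ansible.builtin.service:
      name: "{{ item }}"
      state: started
      enabled: true
    loop: "{{ __kernel_settings_services }}"

Because tuned.service is already active and enabled (ActiveState=active, UnitFileState=enabled in the status dump that follows), the task reports ok with changed=false.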
10492 1726773075.87908: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 10492 1726773075.88053: in run() - task 0affffe7-6841-885f-bbcf-00000000030a 10492 1726773075.88072: variable 'ansible_search_path' from source: unknown 10492 1726773075.88077: variable 'ansible_search_path' from source: unknown 10492 1726773075.88116: variable '__kernel_settings_services' from source: include_vars 10492 1726773075.88408: variable '__kernel_settings_services' from source: include_vars 10492 1726773075.88476: variable 'omit' from source: magic vars 10492 1726773075.88588: variable 'ansible_host' from source: host vars for 'managed_node2' 10492 1726773075.88602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10492 1726773075.88614: variable 'omit' from source: magic vars 10492 1726773075.88683: variable 'omit' from source: magic vars 10492 1726773075.88726: variable 'omit' from source: magic vars 10492 1726773075.88771: variable 'item' from source: unknown 10492 1726773075.88900: variable 'item' from source: unknown 10492 1726773075.88925: variable 'omit' from source: magic vars 10492 1726773075.88960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10492 1726773075.88994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10492 1726773075.89016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10492 1726773075.89034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10492 1726773075.89046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10492 1726773075.89076: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10492 1726773075.89082: variable 'ansible_host' from source: host vars for 'managed_node2' 10492 1726773075.89088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10492 1726773075.89183: Set connection var ansible_pipelining to False 10492 1726773075.89194: Set connection var ansible_timeout to 10 10492 1726773075.89202: Set connection var ansible_module_compression to ZIP_DEFLATED 10492 1726773075.89206: Set connection var ansible_shell_type to sh 10492 1726773075.89211: Set connection var ansible_shell_executable to /bin/sh 10492 1726773075.89215: Set connection var ansible_connection to ssh 10492 1726773075.89232: variable 'ansible_shell_executable' from source: unknown 10492 1726773075.89236: variable 'ansible_connection' from source: unknown 10492 1726773075.89240: variable 'ansible_module_compression' from source: unknown 10492 1726773075.89243: variable 'ansible_shell_type' from source: unknown 10492 1726773075.89246: variable 'ansible_shell_executable' from source: unknown 10492 1726773075.89248: variable 'ansible_host' from source: host vars for 'managed_node2' 10492 1726773075.89252: variable 'ansible_pipelining' from source: unknown 10492 1726773075.89255: variable 'ansible_timeout' from source: unknown 10492 1726773075.89259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10492 1726773075.89379: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10492 1726773075.89394: variable 'omit' from source: magic vars 10492 1726773075.89400: starting attempt loop 10492 1726773075.89404: running the handler 10492 1726773075.89486: variable 'ansible_facts' from source: unknown 10492 1726773075.89602: _low_level_execute_command(): starting 10492 1726773075.89611: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10492 1726773075.92182: stdout chunk (state=2): >>>/root <<< 10492 1726773075.92407: stderr chunk (state=3): >>><<< 10492 1726773075.92415: stdout chunk (state=3): >>><<< 10492 1726773075.92437: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10492 1726773075.92452: _low_level_execute_command(): starting 10492 1726773075.92459: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144 `" && echo ansible-tmp-1726773075.9244573-10492-256746157820144="` echo /root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144 `" ) && sleep 0' 10492 1726773075.95309: stdout chunk (state=2): >>>ansible-tmp-1726773075.9244573-10492-256746157820144=/root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144 <<< 10492 1726773075.95635: stderr chunk (state=3): >>><<< 10492 1726773075.95644: stdout chunk (state=3): >>><<< 10492 1726773075.95663: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773075.9244573-10492-256746157820144=/root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144 , stderr= 10492 1726773075.95700: variable 'ansible_module_compression' from source: unknown 10492 1726773075.95758: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10492 1726773075.95824: variable 'ansible_facts' from source: unknown 10492 1726773075.96061: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144/AnsiballZ_systemd.py 10492 1726773075.96221: Sending initial data 10492 1726773075.96228: Sent initial data (155 bytes) 10492 1726773075.98959: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpo_774y12 /root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144/AnsiballZ_systemd.py <<< 10492 1726773076.01462: stderr chunk (state=3): >>><<< 10492 1726773076.01472: stdout chunk (state=3): >>><<< 10492 1726773076.01500: done transferring module to remote 10492 1726773076.01515: _low_level_execute_command(): starting 10492 1726773076.01521: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144/ /root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144/AnsiballZ_systemd.py && sleep 0' 10492 1726773076.03908: stderr chunk (state=2): >>><<< 10492 1726773076.03916: stdout chunk (state=2): >>><<< 10492 1726773076.03930: _low_level_execute_command() done: rc=0, stdout=, stderr= 10492 1726773076.03934: _low_level_execute_command(): starting 10492 1726773076.03939: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144/AnsiballZ_systemd.py && sleep 0' 10492 1726773076.32768: stdout chunk (state=2): >>> {"name": 
"tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18403328", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 10492 1726773076.32806: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": 
"inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChange<<< 10492 1726773076.32818: stdout chunk (state=3): >>>TimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", 
"AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10492 1726773076.34517: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10492 1726773076.34526: stdout chunk (state=3): >>><<< 10492 1726773076.34535: stderr chunk (state=3): >>><<< 10492 1726773076.34553: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18403328", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", 
"LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", 
"InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.9.64 closed. 10492 1726773076.34727: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10492 1726773076.34751: _low_level_execute_command(): starting 10492 1726773076.34758: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773075.9244573-10492-256746157820144/ > /dev/null 2>&1 && sleep 0' 10492 1726773076.37206: stderr chunk (state=2): >>><<< 10492 1726773076.37213: stdout chunk (state=2): >>><<< 10492 1726773076.37226: _low_level_execute_command() done: rc=0, stdout=, stderr= 10492 1726773076.37233: handler run complete 10492 1726773076.37266: attempt loop complete, returning result 10492 1726773076.37284: variable 'item' from source: unknown 10492 1726773076.37345: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Before": "shutdown.target 
multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "671", "MemoryAccounting": "yes", "MemoryCurrent": "18403328", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "WatchdogUSec": "0" } } 10492 1726773076.37439: dumping result to json 10492 1726773076.37456: done dumping result, returning 10492 1726773076.37464: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-885f-bbcf-00000000030a] 10492 1726773076.37470: sending task 
result for task 0affffe7-6841-885f-bbcf-00000000030a 10492 1726773076.37579: done sending task result for task 0affffe7-6841-885f-bbcf-00000000030a 10492 1726773076.37583: WORKER PROCESS EXITING 8240 1726773076.37914: no more pending results, returning what we have 8240 1726773076.37916: results queue empty 8240 1726773076.37917: checking for any_errors_fatal 8240 1726773076.37921: done checking for any_errors_fatal 8240 1726773076.37921: checking for max_fail_percentage 8240 1726773076.37922: done checking for max_fail_percentage 8240 1726773076.37923: checking to see if all hosts have failed and the running result is not ok 8240 1726773076.37923: done checking to see if all hosts have failed 8240 1726773076.37923: getting the remaining hosts for this loop 8240 1726773076.37924: done getting the remaining hosts for this loop 8240 1726773076.37927: getting the next task for host managed_node2 8240 1726773076.37931: done getting next task for host managed_node2 8240 1726773076.37933: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8240 1726773076.37935: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773076.37945: getting variables 8240 1726773076.37946: in VariableManager get_vars() 8240 1726773076.37969: Calling all_inventory to load vars for managed_node2 8240 1726773076.37971: Calling groups_inventory to load vars for managed_node2 8240 1726773076.37974: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773076.37981: Calling all_plugins_play to load vars for managed_node2 8240 1726773076.37983: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773076.37987: Calling groups_plugins_play to load vars for managed_node2 8240 1726773076.38103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773076.38222: done with get_vars() 8240 1726773076.38230: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:16 -0400 (0:00:00.508) 0:00:55.026 **** 8240 1726773076.38302: entering _queue_task() for managed_node2/file 8240 1726773076.38464: worker is 1 (out of 1 available) 8240 1726773076.38480: exiting _queue_task() for managed_node2/file 8240 1726773076.38495: done queuing things up, now waiting for results queue to drain 8240 1726773076.38497: waiting for pending results... 
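For reference, the systemd invocation whose result is logged above could be produced by a task roughly like the sketch below. Only the module arguments (name=tuned, state=started, enabled=true) and the loop variable name "item" are taken from the log; the list variable feeding the loop is an assumed name, since the log only shows the single item "tuned".

- name: Ensure required services are enabled and started
  systemd:
    name: "{{ item }}"
    state: started
    enabled: true
  loop: "{{ __kernel_settings_required_services }}"  # assumed variable name; the log only shows item=tuned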
10516 1726773076.38621: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 10516 1726773076.38740: in run() - task 0affffe7-6841-885f-bbcf-00000000030b 10516 1726773076.38756: variable 'ansible_search_path' from source: unknown 10516 1726773076.38760: variable 'ansible_search_path' from source: unknown 10516 1726773076.38789: calling self._execute() 10516 1726773076.38859: variable 'ansible_host' from source: host vars for 'managed_node2' 10516 1726773076.38869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10516 1726773076.38878: variable 'omit' from source: magic vars 10516 1726773076.38954: variable 'omit' from source: magic vars 10516 1726773076.38992: variable 'omit' from source: magic vars 10516 1726773076.39015: variable '__kernel_settings_profile_dir' from source: role '' all vars 10516 1726773076.39242: variable '__kernel_settings_profile_dir' from source: role '' all vars 10516 1726773076.39320: variable '__kernel_settings_profile_parent' from source: set_fact 10516 1726773076.39328: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10516 1726773076.39362: variable 'omit' from source: magic vars 10516 1726773076.39396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10516 1726773076.39423: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10516 1726773076.39440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10516 1726773076.39454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10516 1726773076.39465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10516 1726773076.39493: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10516 1726773076.39499: variable 'ansible_host' from source: host vars for 'managed_node2' 10516 1726773076.39503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10516 1726773076.39571: Set connection var ansible_pipelining to False 10516 1726773076.39579: Set connection var ansible_timeout to 10 10516 1726773076.39589: Set connection var ansible_module_compression to ZIP_DEFLATED 10516 1726773076.39592: Set connection var ansible_shell_type to sh 10516 1726773076.39598: Set connection var ansible_shell_executable to /bin/sh 10516 1726773076.39603: Set connection var ansible_connection to ssh 10516 1726773076.39619: variable 'ansible_shell_executable' from source: unknown 10516 1726773076.39622: variable 'ansible_connection' from source: unknown 10516 1726773076.39626: variable 'ansible_module_compression' from source: unknown 10516 1726773076.39629: variable 'ansible_shell_type' from source: unknown 10516 1726773076.39632: variable 'ansible_shell_executable' from source: unknown 10516 1726773076.39636: variable 'ansible_host' from source: host vars for 'managed_node2' 10516 1726773076.39640: variable 'ansible_pipelining' from source: unknown 10516 1726773076.39643: variable 'ansible_timeout' from source: unknown 10516 1726773076.39647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10516 1726773076.39784: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10516 1726773076.39798: variable 'omit' from source: magic vars 10516 1726773076.39805: starting attempt loop 10516 1726773076.39809: running the handler 10516 1726773076.39821: _low_level_execute_command(): starting 10516 1726773076.39829: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10516 1726773076.42165: stdout chunk (state=2): >>>/root <<< 10516 1726773076.42288: stderr chunk (state=3): >>><<< 10516 1726773076.42295: stdout chunk (state=3): >>><<< 10516 1726773076.42314: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10516 1726773076.42327: _low_level_execute_command(): starting 10516 1726773076.42332: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218 `" && echo ansible-tmp-1726773076.4232144-10516-246439416113218="` echo /root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218 `" ) && sleep 0' 10516 1726773076.45096: stdout chunk (state=2): >>>ansible-tmp-1726773076.4232144-10516-246439416113218=/root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218 <<< 10516 1726773076.45226: stderr chunk (state=3): >>><<< 10516 1726773076.45234: stdout chunk (state=3): >>><<< 10516 1726773076.45248: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773076.4232144-10516-246439416113218=/root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218 , stderr= 10516 1726773076.45289: variable 'ansible_module_compression' from source: unknown 10516 1726773076.45332: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10516 1726773076.45361: variable 'ansible_facts' from source: unknown 10516 1726773076.45432: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218/AnsiballZ_file.py 10516 1726773076.45535: Sending initial data 10516 1726773076.45543: Sent initial data (152 bytes) 10516 1726773076.48078: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp1ws100js /root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218/AnsiballZ_file.py <<< 10516 1726773076.49223: stderr chunk (state=3): >>><<< 10516 1726773076.49232: stdout chunk (state=3): >>><<< 10516 1726773076.49252: done transferring module to remote 10516 1726773076.49263: _low_level_execute_command(): starting 10516 1726773076.49268: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218/ /root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218/AnsiballZ_file.py && sleep 0' 10516 1726773076.51861: stderr chunk (state=2): >>><<< 10516 1726773076.51869: stdout chunk (state=2): >>><<< 10516 1726773076.51883: _low_level_execute_command() done: rc=0, stdout=, stderr= 10516 1726773076.51889: _low_level_execute_command(): starting 10516 1726773076.51895: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218/AnsiballZ_file.py && sleep 0' 10516 1726773076.68784: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": 
{"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10516 1726773076.69970: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10516 1726773076.69980: stdout chunk (state=3): >>><<< 10516 1726773076.69993: stderr chunk (state=3): >>><<< 10516 1726773076.70010: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
10516 1726773076.70056: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10516 1726773076.70068: _low_level_execute_command(): starting 10516 1726773076.70073: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773076.4232144-10516-246439416113218/ > /dev/null 2>&1 && sleep 0' 10516 1726773076.72688: stderr chunk (state=2): >>><<< 10516 1726773076.72696: stdout chunk (state=2): >>><<< 10516 1726773076.72712: _low_level_execute_command() done: rc=0, stdout=, stderr= 10516 1726773076.72719: handler run complete 10516 1726773076.72738: attempt loop complete, returning result 10516 1726773076.72741: _execute() done 10516 1726773076.72745: dumping result to json 10516 1726773076.72751: done dumping result, returning 10516 1726773076.72758: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-885f-bbcf-00000000030b] 10516 1726773076.72764: sending task result for task 0affffe7-6841-885f-bbcf-00000000030b 10516 1726773076.72800: done sending task result for task 0affffe7-6841-885f-bbcf-00000000030b 10516 1726773076.72804: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8240 1726773076.72955: no more pending results, returning what we have 8240 1726773076.72958: results queue empty 8240 1726773076.72959: checking for any_errors_fatal 8240 1726773076.72976: done checking for any_errors_fatal 8240 1726773076.72977: checking for max_fail_percentage 8240 1726773076.72979: done checking for max_fail_percentage 8240 1726773076.72979: checking to see if all hosts have failed and the running result is not ok 8240 1726773076.72980: done checking to see if all hosts have failed 8240 1726773076.72981: getting the remaining hosts for this loop 8240 1726773076.72982: done getting the remaining hosts for this loop 8240 1726773076.72987: getting the next task for host managed_node2 8240 1726773076.72993: done getting next task for host managed_node2 8240 1726773076.72997: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8240 1726773076.72999: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 8240 1726773076.73010: getting variables 8240 1726773076.73011: in VariableManager get_vars() 8240 1726773076.73044: Calling all_inventory to load vars for managed_node2 8240 1726773076.73046: Calling groups_inventory to load vars for managed_node2 8240 1726773076.73047: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773076.73055: Calling all_plugins_play to load vars for managed_node2 8240 1726773076.73057: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773076.73059: Calling groups_plugins_play to load vars for managed_node2 8240 1726773076.73169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773076.73296: done with get_vars() 8240 1726773076.73304: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:16 -0400 (0:00:00.350) 0:00:55.377 **** 8240 1726773076.73375: entering _queue_task() for managed_node2/slurp 8240 1726773076.73537: worker is 1 (out of 1 available) 8240 1726773076.73551: exiting _queue_task() for managed_node2/slurp 8240 1726773076.73564: done queuing things up, now waiting for results queue to drain 8240 1726773076.73566: waiting for pending results... 10532 1726773076.73691: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 10532 1726773076.73806: in run() - task 0affffe7-6841-885f-bbcf-00000000030c 10532 1726773076.73822: variable 'ansible_search_path' from source: unknown 10532 1726773076.73826: variable 'ansible_search_path' from source: unknown 10532 1726773076.73861: calling self._execute() 10532 1726773076.73931: variable 'ansible_host' from source: host vars for 'managed_node2' 10532 1726773076.73939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10532 1726773076.73949: variable 'omit' from source: magic vars 10532 1726773076.74022: variable 'omit' from source: magic vars 10532 1726773076.74053: variable 'omit' from source: magic vars 10532 1726773076.74075: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10532 1726773076.74335: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10532 1726773076.74398: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10532 1726773076.74427: variable 'omit' from source: magic vars 10532 1726773076.74458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10532 1726773076.74488: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10532 1726773076.74509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10532 1726773076.74527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10532 1726773076.74540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10532 1726773076.74571: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10532 1726773076.74578: variable 'ansible_host' from source: host vars for 'managed_node2' 10532 1726773076.74583: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 10532 1726773076.74697: Set connection var ansible_pipelining to False 10532 1726773076.74706: Set connection var ansible_timeout to 10 10532 1726773076.74713: Set connection var ansible_module_compression to ZIP_DEFLATED 10532 1726773076.74716: Set connection var ansible_shell_type to sh 10532 1726773076.74721: Set connection var ansible_shell_executable to /bin/sh 10532 1726773076.74725: Set connection var ansible_connection to ssh 10532 1726773076.74744: variable 'ansible_shell_executable' from source: unknown 10532 1726773076.74749: variable 'ansible_connection' from source: unknown 10532 1726773076.74753: variable 'ansible_module_compression' from source: unknown 10532 1726773076.74755: variable 'ansible_shell_type' from source: unknown 10532 1726773076.74758: variable 'ansible_shell_executable' from source: unknown 10532 1726773076.74761: variable 'ansible_host' from source: host vars for 'managed_node2' 10532 1726773076.74764: variable 'ansible_pipelining' from source: unknown 10532 1726773076.74767: variable 'ansible_timeout' from source: unknown 10532 1726773076.74770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10532 1726773076.74945: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10532 1726773076.74957: variable 'omit' from source: magic vars 10532 1726773076.74965: starting attempt loop 10532 1726773076.74968: running the handler 10532 1726773076.74980: _low_level_execute_command(): starting 10532 1726773076.74988: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10532 1726773076.77324: stdout chunk (state=2): >>>/root <<< 10532 1726773076.77440: stderr chunk (state=3): >>><<< 10532 1726773076.77446: stdout chunk (state=3): >>><<< 10532 1726773076.77463: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10532 1726773076.77477: _low_level_execute_command(): starting 10532 1726773076.77483: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808 `" && echo ansible-tmp-1726773076.7747004-10532-93442377995808="` echo /root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808 `" ) && sleep 0' 10532 1726773076.80088: stdout chunk (state=2): >>>ansible-tmp-1726773076.7747004-10532-93442377995808=/root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808 <<< 10532 1726773076.80220: stderr chunk (state=3): >>><<< 10532 1726773076.80227: stdout chunk (state=3): >>><<< 10532 1726773076.80242: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773076.7747004-10532-93442377995808=/root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808 , stderr= 10532 1726773076.80281: variable 'ansible_module_compression' from source: unknown 10532 1726773076.80318: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10532 1726773076.80348: variable 'ansible_facts' from source: unknown 10532 1726773076.80422: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808/AnsiballZ_slurp.py 10532 1726773076.80604: Sending initial data 10532 1726773076.80612: Sent initial data (152 bytes) 
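The connection variables applied for this task (ansible_host and ansible_ssh_extra_args from host vars, ssh connection, /bin/sh shell) would come from an inventory entry shaped roughly like the sketch below. Only the managed_node2 to 10.31.9.64 pairing is visible in this log; the ssh extra args value is a placeholder, not the real setting.

all:
  hosts:
    managed_node2:
      ansible_host: 10.31.9.64
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"  # placeholder; actual value not shown in the log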
10532 1726773076.83120: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpy7plncx8 /root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808/AnsiballZ_slurp.py <<< 10532 1726773076.84213: stderr chunk (state=3): >>><<< 10532 1726773076.84221: stdout chunk (state=3): >>><<< 10532 1726773076.84241: done transferring module to remote 10532 1726773076.84252: _low_level_execute_command(): starting 10532 1726773076.84257: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808/ /root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808/AnsiballZ_slurp.py && sleep 0' 10532 1726773076.86647: stderr chunk (state=2): >>><<< 10532 1726773076.86657: stdout chunk (state=2): >>><<< 10532 1726773076.86674: _low_level_execute_command() done: rc=0, stdout=, stderr= 10532 1726773076.86679: _low_level_execute_command(): starting 10532 1726773076.86684: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808/AnsiballZ_slurp.py && sleep 0' 10532 1726773077.01615: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 10532 1726773077.02639: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10532 1726773077.02690: stderr chunk (state=3): >>><<< 10532 1726773077.02697: stdout chunk (state=3): >>><<< 10532 1726773077.02712: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.9.64 closed. 
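The slurp result above, together with the Set active_profile task that follows, could be expressed roughly as the pair of tasks sketched below. The base64 payload dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK decodes to "virtual-guest kernel_settings", which is the value the later set_fact assigns to __kernel_settings_active_profile. The slurped path, the variable names, and the resulting fact value come from the log; how the role actually wires them together (register name, the exact Jinja expression) is an assumed reconstruction.

- name: Get active_profile
  slurp:
    path: /etc/tuned/active_profile
  register: __kernel_settings_tuned_current_profile  # assumed register target; the name appears in the log

- name: Set active_profile
  set_fact:
    # b64decode | trim reproduces the logged value "virtual-guest kernel_settings";
    # the role's real expression may differ (for example, appending kernel_settings only when missing)
    __kernel_settings_active_profile: "{{ __kernel_settings_tuned_current_profile.content | b64decode | trim }}"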
10532 1726773077.02735: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10532 1726773077.02748: _low_level_execute_command(): starting 10532 1726773077.02754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773076.7747004-10532-93442377995808/ > /dev/null 2>&1 && sleep 0' 10532 1726773077.05189: stderr chunk (state=2): >>><<< 10532 1726773077.05198: stdout chunk (state=2): >>><<< 10532 1726773077.05212: _low_level_execute_command() done: rc=0, stdout=, stderr= 10532 1726773077.05219: handler run complete 10532 1726773077.05233: attempt loop complete, returning result 10532 1726773077.05238: _execute() done 10532 1726773077.05241: dumping result to json 10532 1726773077.05246: done dumping result, returning 10532 1726773077.05253: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-885f-bbcf-00000000030c] 10532 1726773077.05260: sending task result for task 0affffe7-6841-885f-bbcf-00000000030c 10532 1726773077.05292: done sending task result for task 0affffe7-6841-885f-bbcf-00000000030c 10532 1726773077.05295: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8240 1726773077.05432: no more pending results, returning what we have 8240 1726773077.05435: results queue empty 8240 1726773077.05436: checking for any_errors_fatal 8240 1726773077.05443: done checking for any_errors_fatal 8240 1726773077.05443: checking for max_fail_percentage 8240 1726773077.05445: done checking for max_fail_percentage 8240 1726773077.05445: checking to see if all hosts have failed and the running result is not ok 8240 1726773077.05446: done checking to see if all hosts have failed 8240 1726773077.05447: getting the remaining hosts for this loop 8240 1726773077.05448: done getting the remaining hosts for this loop 8240 1726773077.05451: getting the next task for host managed_node2 8240 1726773077.05457: done getting next task for host managed_node2 8240 1726773077.05460: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8240 1726773077.05463: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773077.05472: getting variables 8240 1726773077.05474: in VariableManager get_vars() 8240 1726773077.05512: Calling all_inventory to load vars for managed_node2 8240 1726773077.05515: Calling groups_inventory to load vars for managed_node2 8240 1726773077.05516: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773077.05527: Calling all_plugins_play to load vars for managed_node2 8240 1726773077.05529: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773077.05532: Calling groups_plugins_play to load vars for managed_node2 8240 1726773077.05647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773077.05769: done with get_vars() 8240 1726773077.05778: done getting variables 8240 1726773077.05824: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:17 -0400 (0:00:00.324) 0:00:55.702 **** 8240 1726773077.05846: entering _queue_task() for managed_node2/set_fact 8240 1726773077.06013: worker is 1 (out of 1 available) 8240 1726773077.06026: exiting _queue_task() for managed_node2/set_fact 8240 1726773077.06040: done queuing things up, now waiting for results queue to drain 8240 1726773077.06042: waiting for pending results... 10545 1726773077.06168: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 10545 1726773077.06288: in run() - task 0affffe7-6841-885f-bbcf-00000000030d 10545 1726773077.06303: variable 'ansible_search_path' from source: unknown 10545 1726773077.06308: variable 'ansible_search_path' from source: unknown 10545 1726773077.06336: calling self._execute() 10545 1726773077.06492: variable 'ansible_host' from source: host vars for 'managed_node2' 10545 1726773077.06501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10545 1726773077.06510: variable 'omit' from source: magic vars 10545 1726773077.06581: variable 'omit' from source: magic vars 10545 1726773077.06615: variable 'omit' from source: magic vars 10545 1726773077.06926: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10545 1726773077.06937: variable '__cur_profile' from source: task vars 10545 1726773077.07267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10545 1726773077.09403: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10545 1726773077.09480: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10545 1726773077.09524: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10545 1726773077.09562: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10545 1726773077.09590: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10545 1726773077.09663: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10545 1726773077.09695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10545 1726773077.09722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10545 1726773077.09761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10545 1726773077.09775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10545 1726773077.09888: variable '__kernel_settings_tuned_current_profile' from source: set_fact 10545 1726773077.09941: variable 'omit' from source: magic vars 10545 1726773077.09970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10545 1726773077.09999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10545 1726773077.10019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10545 1726773077.10036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10545 1726773077.10047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10545 1726773077.10077: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10545 1726773077.10084: variable 'ansible_host' from source: host vars for 'managed_node2' 10545 1726773077.10090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10545 1726773077.10189: Set connection var ansible_pipelining to False 10545 1726773077.10198: Set connection var ansible_timeout to 10 10545 1726773077.10207: Set connection var ansible_module_compression to ZIP_DEFLATED 10545 1726773077.10210: Set connection var ansible_shell_type to sh 10545 1726773077.10215: Set connection var ansible_shell_executable to /bin/sh 10545 1726773077.10220: Set connection var ansible_connection to ssh 10545 1726773077.10243: variable 'ansible_shell_executable' from source: unknown 10545 1726773077.10248: variable 'ansible_connection' from source: unknown 10545 1726773077.10251: variable 'ansible_module_compression' from source: unknown 10545 1726773077.10254: variable 'ansible_shell_type' from source: unknown 10545 1726773077.10257: variable 'ansible_shell_executable' from source: unknown 10545 1726773077.10261: variable 'ansible_host' from source: host vars for 'managed_node2' 10545 1726773077.10264: variable 'ansible_pipelining' from source: unknown 10545 1726773077.10268: variable 'ansible_timeout' from source: unknown 10545 1726773077.10272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10545 1726773077.10364: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10545 1726773077.10376: variable 'omit' from source: magic vars 10545 1726773077.10382: starting attempt loop 10545 1726773077.10388: running the handler 10545 1726773077.10399: handler run complete 10545 1726773077.10410: attempt loop complete, returning result 10545 1726773077.10413: _execute() done 10545 1726773077.10416: dumping result to json 10545 1726773077.10419: done dumping result, returning 10545 1726773077.10426: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-885f-bbcf-00000000030d] 10545 1726773077.10432: sending task result for task 0affffe7-6841-885f-bbcf-00000000030d 10545 1726773077.10458: done sending task result for task 0affffe7-6841-885f-bbcf-00000000030d 10545 1726773077.10461: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8240 1726773077.11050: no more pending results, returning what we have 8240 1726773077.11053: results queue empty 8240 1726773077.11055: checking for any_errors_fatal 8240 1726773077.11060: done checking for any_errors_fatal 8240 1726773077.11061: checking for max_fail_percentage 8240 1726773077.11062: done checking for max_fail_percentage 8240 1726773077.11063: checking to see if all hosts have failed and the running result is not ok 8240 1726773077.11064: done checking to see if all hosts have failed 8240 1726773077.11064: getting the remaining hosts for this loop 8240 1726773077.11066: done getting the remaining hosts for this loop 8240 1726773077.11070: getting the next task for host managed_node2 8240 1726773077.11076: done getting next task for host managed_node2 8240 1726773077.11079: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8240 1726773077.11082: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773077.11100: getting variables 8240 1726773077.11102: in VariableManager get_vars() 8240 1726773077.11130: Calling all_inventory to load vars for managed_node2 8240 1726773077.11133: Calling groups_inventory to load vars for managed_node2 8240 1726773077.11136: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773077.11148: Calling all_plugins_play to load vars for managed_node2 8240 1726773077.11151: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773077.11154: Calling groups_plugins_play to load vars for managed_node2 8240 1726773077.11308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773077.11504: done with get_vars() 8240 1726773077.11514: done getting variables 8240 1726773077.11574: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:17 -0400 (0:00:00.057) 0:00:55.759 **** 8240 1726773077.11609: entering _queue_task() for managed_node2/copy 8240 1726773077.11795: worker is 1 (out of 1 available) 8240 1726773077.11810: exiting _queue_task() for managed_node2/copy 8240 1726773077.11823: done queuing things up, now waiting for results queue to drain 8240 1726773077.11825: waiting for pending results... 10548 1726773077.11959: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 10548 1726773077.12081: in run() - task 0affffe7-6841-885f-bbcf-00000000030e 10548 1726773077.12098: variable 'ansible_search_path' from source: unknown 10548 1726773077.12103: variable 'ansible_search_path' from source: unknown 10548 1726773077.12130: calling self._execute() 10548 1726773077.12200: variable 'ansible_host' from source: host vars for 'managed_node2' 10548 1726773077.12209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10548 1726773077.12217: variable 'omit' from source: magic vars 10548 1726773077.12294: variable 'omit' from source: magic vars 10548 1726773077.12327: variable 'omit' from source: magic vars 10548 1726773077.12349: variable '__kernel_settings_active_profile' from source: set_fact 10548 1726773077.12569: variable '__kernel_settings_active_profile' from source: set_fact 10548 1726773077.12596: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10548 1726773077.12648: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10548 1726773077.12706: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10548 1726773077.12729: variable 'omit' from source: magic vars 10548 1726773077.12762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10548 1726773077.12795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10548 1726773077.12815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10548 1726773077.12828: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10548 1726773077.12839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10548 1726773077.12863: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10548 1726773077.12869: variable 'ansible_host' from source: host vars for 'managed_node2' 10548 1726773077.12875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10548 1726773077.12942: Set connection var ansible_pipelining to False 10548 1726773077.12949: Set connection var ansible_timeout to 10 10548 1726773077.12958: Set connection var ansible_module_compression to ZIP_DEFLATED 10548 1726773077.12962: Set connection var ansible_shell_type to sh 10548 1726773077.12967: Set connection var ansible_shell_executable to /bin/sh 10548 1726773077.12974: Set connection var ansible_connection to ssh 10548 1726773077.12992: variable 'ansible_shell_executable' from source: unknown 10548 1726773077.12996: variable 'ansible_connection' from source: unknown 10548 1726773077.12999: variable 'ansible_module_compression' from source: unknown 10548 1726773077.13003: variable 'ansible_shell_type' from source: unknown 10548 1726773077.13006: variable 'ansible_shell_executable' from source: unknown 10548 1726773077.13009: variable 'ansible_host' from source: host vars for 'managed_node2' 10548 1726773077.13013: variable 'ansible_pipelining' from source: unknown 10548 1726773077.13017: variable 'ansible_timeout' from source: unknown 10548 1726773077.13021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10548 1726773077.13110: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10548 1726773077.13119: variable 'omit' from source: magic vars 10548 1726773077.13124: starting attempt loop 10548 1726773077.13126: running the handler 10548 1726773077.13136: _low_level_execute_command(): starting 10548 1726773077.13142: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10548 1726773077.15478: stdout chunk (state=2): >>>/root <<< 10548 1726773077.15600: stderr chunk (state=3): >>><<< 10548 1726773077.15607: stdout chunk (state=3): >>><<< 10548 1726773077.15625: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10548 1726773077.15638: _low_level_execute_command(): starting 10548 1726773077.15643: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493 `" && echo ansible-tmp-1726773077.156331-10548-128016979374493="` echo /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493 `" ) && sleep 0' 10548 1726773077.18401: stdout chunk (state=2): >>>ansible-tmp-1726773077.156331-10548-128016979374493=/root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493 <<< 10548 1726773077.18528: stderr chunk (state=3): >>><<< 10548 1726773077.18535: stdout chunk (state=3): >>><<< 10548 1726773077.18549: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773077.156331-10548-128016979374493=/root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493 , 
stderr= 10548 1726773077.18623: variable 'ansible_module_compression' from source: unknown 10548 1726773077.18666: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10548 1726773077.18699: variable 'ansible_facts' from source: unknown 10548 1726773077.18766: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/AnsiballZ_stat.py 10548 1726773077.18854: Sending initial data 10548 1726773077.18862: Sent initial data (151 bytes) 10548 1726773077.21388: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp15rgj72h /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/AnsiballZ_stat.py <<< 10548 1726773077.22507: stderr chunk (state=3): >>><<< 10548 1726773077.22517: stdout chunk (state=3): >>><<< 10548 1726773077.22537: done transferring module to remote 10548 1726773077.22548: _low_level_execute_command(): starting 10548 1726773077.22553: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/ /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/AnsiballZ_stat.py && sleep 0' 10548 1726773077.24997: stderr chunk (state=2): >>><<< 10548 1726773077.25006: stdout chunk (state=2): >>><<< 10548 1726773077.25019: _low_level_execute_command() done: rc=0, stdout=, stderr= 10548 1726773077.25024: _low_level_execute_command(): starting 10548 1726773077.25029: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/AnsiballZ_stat.py && sleep 0' 10548 1726773077.44693: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773076.9554763, "mtime": 1726773063.2941759, "ctime": 1726773063.2941759, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10548 1726773077.45847: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 10548 1726773077.45893: stderr chunk (state=3): >>><<< 10548 1726773077.45900: stdout chunk (state=3): >>><<< 10548 1726773077.45915: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773076.9554763, "mtime": 1726773063.2941759, "ctime": 1726773063.2941759, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 10548 1726773077.45960: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10548 1726773077.45997: variable 'ansible_module_compression' from source: unknown 10548 1726773077.46029: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10548 1726773077.46047: variable 'ansible_facts' from source: unknown 10548 1726773077.46103: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/AnsiballZ_file.py 10548 1726773077.46194: Sending initial data 10548 1726773077.46201: Sent initial data (151 bytes) 10548 1726773077.48745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpi76_wc0g /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/AnsiballZ_file.py <<< 10548 1726773077.49890: stderr chunk (state=3): >>><<< 10548 1726773077.49897: stdout chunk (state=3): >>><<< 10548 1726773077.49915: done transferring module to remote 10548 1726773077.49924: _low_level_execute_command(): starting 10548 1726773077.49930: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/ /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/AnsiballZ_file.py && sleep 0' 10548 1726773077.52393: stderr chunk (state=2): >>><<< 10548 1726773077.52404: stdout chunk (state=2): >>><<< 10548 1726773077.52418: _low_level_execute_command() done: rc=0, stdout=, stderr= 10548 1726773077.52422: 
_low_level_execute_command(): starting 10548 1726773077.52427: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/AnsiballZ_file.py && sleep 0' 10548 1726773077.68512: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp2q46g8lp", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10548 1726773077.69630: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10548 1726773077.69679: stderr chunk (state=3): >>><<< 10548 1726773077.69688: stdout chunk (state=3): >>><<< 10548 1726773077.69705: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmp2q46g8lp", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
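
The exchange above is the copy action's usual two-step pattern: it first runs ansible.legacy.stat against /etc/tuned/active_profile, and because the reported sha1 checksum already matches the desired content it skips transferring a new file and only runs ansible.legacy.file to enforce owner, group and mode, so the task result that follows is ok rather than changed. A minimal sketch of a task that produces this call sequence is shown below; the role's real task body is not visible in this trace, so treating __kernel_settings_active_profile as the file content and __kernel_settings_tuned_active_profile as the destination path is an assumption based only on the variable names resolved above.

- name: Ensure kernel_settings is in active_profile   # sketch, not the role's actual source
  ansible.builtin.copy:
    # assumption: the set_fact variable holds the desired profile list as a string
    content: "{{ __kernel_settings_active_profile }}\n"
    # assumption: this role variable resolves to /etc/tuned/active_profile
    dest: "{{ __kernel_settings_tuned_active_profile }}"
    mode: "0600"
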
10548 1726773077.69732: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmp2q46g8lp', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10548 1726773077.69743: _low_level_execute_command(): starting 10548 1726773077.69749: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773077.156331-10548-128016979374493/ > /dev/null 2>&1 && sleep 0' 10548 1726773077.72193: stderr chunk (state=2): >>><<< 10548 1726773077.72202: stdout chunk (state=2): >>><<< 10548 1726773077.72218: _low_level_execute_command() done: rc=0, stdout=, stderr= 10548 1726773077.72227: handler run complete 10548 1726773077.72248: attempt loop complete, returning result 10548 1726773077.72251: _execute() done 10548 1726773077.72254: dumping result to json 10548 1726773077.72260: done dumping result, returning 10548 1726773077.72268: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-885f-bbcf-00000000030e] 10548 1726773077.72276: sending task result for task 0affffe7-6841-885f-bbcf-00000000030e 10548 1726773077.72312: done sending task result for task 0affffe7-6841-885f-bbcf-00000000030e 10548 1726773077.72315: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8240 1726773077.72558: no more pending results, returning what we have 8240 1726773077.72561: results queue empty 8240 1726773077.72561: checking for any_errors_fatal 8240 1726773077.72566: done checking for any_errors_fatal 8240 1726773077.72566: checking for max_fail_percentage 8240 1726773077.72567: done checking for max_fail_percentage 8240 1726773077.72568: checking to see if all hosts have failed and the running result is not ok 8240 1726773077.72568: done checking to see if all hosts have failed 8240 1726773077.72569: getting the remaining hosts for this loop 8240 1726773077.72570: done getting the remaining hosts for this loop 8240 1726773077.72572: getting the next task for host managed_node2 8240 1726773077.72577: done getting next task for host managed_node2 8240 1726773077.72580: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8240 1726773077.72582: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773077.72592: getting variables 8240 1726773077.72593: in VariableManager get_vars() 8240 1726773077.72619: Calling all_inventory to load vars for managed_node2 8240 1726773077.72621: Calling groups_inventory to load vars for managed_node2 8240 1726773077.72622: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773077.72630: Calling all_plugins_play to load vars for managed_node2 8240 1726773077.72632: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773077.72633: Calling groups_plugins_play to load vars for managed_node2 8240 1726773077.72758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773077.72902: done with get_vars() 8240 1726773077.72909: done getting variables 8240 1726773077.72952: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:17 -0400 (0:00:00.613) 0:00:56.373 **** 8240 1726773077.72974: entering _queue_task() for managed_node2/copy 8240 1726773077.73166: worker is 1 (out of 1 available) 8240 1726773077.73179: exiting _queue_task() for managed_node2/copy 8240 1726773077.73192: done queuing things up, now waiting for results queue to drain 8240 1726773077.73194: waiting for pending results... 
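
tuned keeps a second small state file next to active_profile, called profile_mode, recording whether the current profile was selected manually or automatically; the next task writes it so that tuned treats the selection as manual (the stat output further down reports a 7-byte file, consistent with the string "manual" plus a trailing newline). A hedged sketch of such a task, reusing the variable names that appear in the following trace, could look like this; the exact content expression is an assumption.

- name: Set profile_mode to manual   # illustrative sketch only
  ansible.builtin.copy:
    # assumption: __kernel_settings_tuned_profile_mode expands to the string "manual"
    content: "{{ __kernel_settings_tuned_profile_mode }}\n"
    # assumption: __kernel_settings_tuned_dir resolves to /etc/tuned
    dest: "{{ __kernel_settings_tuned_dir }}/profile_mode"
    mode: "0600"
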
10573 1726773077.73452: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 10573 1726773077.73584: in run() - task 0affffe7-6841-885f-bbcf-00000000030f 10573 1726773077.73603: variable 'ansible_search_path' from source: unknown 10573 1726773077.73607: variable 'ansible_search_path' from source: unknown 10573 1726773077.73638: calling self._execute() 10573 1726773077.73725: variable 'ansible_host' from source: host vars for 'managed_node2' 10573 1726773077.73735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10573 1726773077.73743: variable 'omit' from source: magic vars 10573 1726773077.73835: variable 'omit' from source: magic vars 10573 1726773077.73884: variable 'omit' from source: magic vars 10573 1726773077.73908: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10573 1726773077.74143: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10573 1726773077.74206: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10573 1726773077.74235: variable 'omit' from source: magic vars 10573 1726773077.74266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10573 1726773077.74294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10573 1726773077.74313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10573 1726773077.74327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10573 1726773077.74338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10573 1726773077.74363: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10573 1726773077.74368: variable 'ansible_host' from source: host vars for 'managed_node2' 10573 1726773077.74373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10573 1726773077.74443: Set connection var ansible_pipelining to False 10573 1726773077.74450: Set connection var ansible_timeout to 10 10573 1726773077.74458: Set connection var ansible_module_compression to ZIP_DEFLATED 10573 1726773077.74462: Set connection var ansible_shell_type to sh 10573 1726773077.74468: Set connection var ansible_shell_executable to /bin/sh 10573 1726773077.74473: Set connection var ansible_connection to ssh 10573 1726773077.74492: variable 'ansible_shell_executable' from source: unknown 10573 1726773077.74496: variable 'ansible_connection' from source: unknown 10573 1726773077.74499: variable 'ansible_module_compression' from source: unknown 10573 1726773077.74502: variable 'ansible_shell_type' from source: unknown 10573 1726773077.74506: variable 'ansible_shell_executable' from source: unknown 10573 1726773077.74509: variable 'ansible_host' from source: host vars for 'managed_node2' 10573 1726773077.74513: variable 'ansible_pipelining' from source: unknown 10573 1726773077.74517: variable 'ansible_timeout' from source: unknown 10573 1726773077.74521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10573 1726773077.74613: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10573 1726773077.74625: variable 'omit' from source: magic vars 10573 1726773077.74632: starting attempt loop 10573 1726773077.74635: running the handler 10573 1726773077.74646: _low_level_execute_command(): starting 10573 1726773077.74654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10573 1726773077.77194: stdout chunk (state=2): >>>/root <<< 10573 1726773077.77207: stderr chunk (state=2): >>><<< 10573 1726773077.77223: stdout chunk (state=3): >>><<< 10573 1726773077.77243: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10573 1726773077.77260: _low_level_execute_command(): starting 10573 1726773077.77267: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831 `" && echo ansible-tmp-1726773077.772523-10573-8100730154831="` echo /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831 `" ) && sleep 0' 10573 1726773077.80180: stdout chunk (state=2): >>>ansible-tmp-1726773077.772523-10573-8100730154831=/root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831 <<< 10573 1726773077.80337: stderr chunk (state=3): >>><<< 10573 1726773077.80348: stdout chunk (state=3): >>><<< 10573 1726773077.80369: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773077.772523-10573-8100730154831=/root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831 , stderr= 10573 1726773077.80462: variable 'ansible_module_compression' from source: unknown 10573 1726773077.80527: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10573 1726773077.80560: variable 'ansible_facts' from source: unknown 10573 1726773077.80646: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/AnsiballZ_stat.py 10573 1726773077.80891: Sending initial data 10573 1726773077.80898: Sent initial data (149 bytes) 10573 1726773077.83739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpj481fqcm /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/AnsiballZ_stat.py <<< 10573 1726773077.84767: stderr chunk (state=3): >>><<< 10573 1726773077.84775: stdout chunk (state=3): >>><<< 10573 1726773077.84796: done transferring module to remote 10573 1726773077.84808: _low_level_execute_command(): starting 10573 1726773077.84812: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/ /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/AnsiballZ_stat.py && sleep 0' 10573 1726773077.87177: stderr chunk (state=2): >>><<< 10573 1726773077.87189: stdout chunk (state=2): >>><<< 10573 1726773077.87205: _low_level_execute_command() done: rc=0, stdout=, stderr= 10573 1726773077.87209: _low_level_execute_command(): starting 10573 1726773077.87215: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/AnsiballZ_stat.py && sleep 0' 10573 1726773078.03612: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, 
"isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773063.1641746, "mtime": 1726773063.2951758, "ctime": 1726773063.2951758, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10573 1726773078.04743: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10573 1726773078.04794: stderr chunk (state=3): >>><<< 10573 1726773078.04801: stdout chunk (state=3): >>><<< 10573 1726773078.04817: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773063.1641746, "mtime": 1726773063.2951758, "ctime": 1726773063.2951758, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
10573 1726773078.04858: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10573 1726773078.04901: variable 'ansible_module_compression' from source: unknown 10573 1726773078.04934: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10573 1726773078.04951: variable 'ansible_facts' from source: unknown 10573 1726773078.05013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/AnsiballZ_file.py 10573 1726773078.05101: Sending initial data 10573 1726773078.05108: Sent initial data (149 bytes) 10573 1726773078.07644: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp7til6t7t /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/AnsiballZ_file.py <<< 10573 1726773078.08820: stderr chunk (state=3): >>><<< 10573 1726773078.08828: stdout chunk (state=3): >>><<< 10573 1726773078.08847: done transferring module to remote 10573 1726773078.08856: _low_level_execute_command(): starting 10573 1726773078.08861: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/ /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/AnsiballZ_file.py && sleep 0' 10573 1726773078.11233: stderr chunk (state=2): >>><<< 10573 1726773078.11244: stdout chunk (state=2): >>><<< 10573 1726773078.11260: _low_level_execute_command() done: rc=0, stdout=, stderr= 10573 1726773078.11264: _low_level_execute_command(): starting 10573 1726773078.11270: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/AnsiballZ_file.py && sleep 0' 10573 1726773078.27211: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpcr9jvnao", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10573 1726773078.28349: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 10573 1726773078.28402: stderr chunk (state=3): >>><<< 10573 1726773078.28410: stdout chunk (state=3): >>><<< 10573 1726773078.28427: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpcr9jvnao", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 10573 1726773078.28455: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpcr9jvnao', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10573 1726773078.28466: _low_level_execute_command(): starting 10573 1726773078.28472: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773077.772523-10573-8100730154831/ > /dev/null 2>&1 && sleep 0' 10573 1726773078.30915: stderr chunk (state=2): >>><<< 10573 1726773078.30924: stdout chunk (state=2): >>><<< 10573 1726773078.30939: _low_level_execute_command() done: rc=0, stdout=, stderr= 10573 1726773078.30947: handler run complete 10573 1726773078.30968: attempt loop complete, returning result 10573 1726773078.30975: _execute() done 10573 1726773078.30979: dumping result to json 10573 1726773078.30984: done dumping result, returning 10573 1726773078.30993: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-885f-bbcf-00000000030f] 10573 1726773078.31000: sending task result for task 0affffe7-6841-885f-bbcf-00000000030f 10573 1726773078.31031: done sending task result for task 0affffe7-6841-885f-bbcf-00000000030f 10573 1726773078.31034: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8240 1726773078.31203: no more pending results, returning what we have 8240 1726773078.31207: results queue empty 8240 1726773078.31208: checking for any_errors_fatal 8240 1726773078.31214: done checking for any_errors_fatal 8240 1726773078.31214: checking 
for max_fail_percentage 8240 1726773078.31216: done checking for max_fail_percentage 8240 1726773078.31216: checking to see if all hosts have failed and the running result is not ok 8240 1726773078.31217: done checking to see if all hosts have failed 8240 1726773078.31218: getting the remaining hosts for this loop 8240 1726773078.31219: done getting the remaining hosts for this loop 8240 1726773078.31222: getting the next task for host managed_node2 8240 1726773078.31228: done getting next task for host managed_node2 8240 1726773078.31231: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8240 1726773078.31234: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773078.31244: getting variables 8240 1726773078.31245: in VariableManager get_vars() 8240 1726773078.31278: Calling all_inventory to load vars for managed_node2 8240 1726773078.31281: Calling groups_inventory to load vars for managed_node2 8240 1726773078.31283: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773078.31294: Calling all_plugins_play to load vars for managed_node2 8240 1726773078.31298: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773078.31302: Calling groups_plugins_play to load vars for managed_node2 8240 1726773078.31416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773078.31537: done with get_vars() 8240 1726773078.31547: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:18 -0400 (0:00:00.586) 0:00:56.959 **** 8240 1726773078.31611: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773078.31774: worker is 1 (out of 1 available) 8240 1726773078.31789: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773078.31803: done queuing things up, now waiting for results queue to drain 8240 1726773078.31806: waiting for pending results... 
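
Unlike the two copy tasks above, the next task calls kernel_settings_get_config, a module shipped inside the fedora.linux_system_roles collection, which is why the generic 'normal' action plugin is loaded for it below; the module parses an existing tuned profile into a dictionary of sections. A sketch of calling it directly follows: the path is the one shown in the module invocation later in the trace, the register name is invented for illustration, and the commented file layout reflects the values the module actually returns further down.

# Sketch only - the role's real task body is not visible in this trace.
- name: Get current config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: /etc/tuned/kernel_settings/tuned.conf
  register: __current_config          # hypothetical register name
# The returned dictionary ({"main": ..., "sysctl": ..., "sysfs": ...}) is the
# parsed form of an INI-style tuned.conf roughly like:
#   [main]
#   summary = kernel settings
#   [sysctl]
#   fs.epoll.max_user_watches = 785592
#   fs.file-max = 400000
#   vm.max_map_count = 65530
#   [sysfs]
#   /sys/class/net/lo/mtu = 65000
#   /sys/fs/selinux/avc/cache_threshold = 256
#   /sys/kernel/debug/x86/ibrs_enabled = 0
#   /sys/kernel/debug/x86/pti_enabled = 0
#   /sys/kernel/debug/x86/retp_enabled = 0
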
10601 1726773078.31931: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 10601 1726773078.32047: in run() - task 0affffe7-6841-885f-bbcf-000000000310 10601 1726773078.32063: variable 'ansible_search_path' from source: unknown 10601 1726773078.32067: variable 'ansible_search_path' from source: unknown 10601 1726773078.32098: calling self._execute() 10601 1726773078.32167: variable 'ansible_host' from source: host vars for 'managed_node2' 10601 1726773078.32178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10601 1726773078.32189: variable 'omit' from source: magic vars 10601 1726773078.32267: variable 'omit' from source: magic vars 10601 1726773078.32306: variable 'omit' from source: magic vars 10601 1726773078.32328: variable '__kernel_settings_profile_filename' from source: role '' all vars 10601 1726773078.32548: variable '__kernel_settings_profile_filename' from source: role '' all vars 10601 1726773078.32613: variable '__kernel_settings_profile_dir' from source: role '' all vars 10601 1726773078.32675: variable '__kernel_settings_profile_parent' from source: set_fact 10601 1726773078.32686: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10601 1726773078.32774: variable 'omit' from source: magic vars 10601 1726773078.32810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10601 1726773078.32836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10601 1726773078.32854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10601 1726773078.32868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10601 1726773078.32881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10601 1726773078.32908: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10601 1726773078.32914: variable 'ansible_host' from source: host vars for 'managed_node2' 10601 1726773078.32919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10601 1726773078.32990: Set connection var ansible_pipelining to False 10601 1726773078.32997: Set connection var ansible_timeout to 10 10601 1726773078.33005: Set connection var ansible_module_compression to ZIP_DEFLATED 10601 1726773078.33009: Set connection var ansible_shell_type to sh 10601 1726773078.33015: Set connection var ansible_shell_executable to /bin/sh 10601 1726773078.33021: Set connection var ansible_connection to ssh 10601 1726773078.33036: variable 'ansible_shell_executable' from source: unknown 10601 1726773078.33040: variable 'ansible_connection' from source: unknown 10601 1726773078.33043: variable 'ansible_module_compression' from source: unknown 10601 1726773078.33046: variable 'ansible_shell_type' from source: unknown 10601 1726773078.33050: variable 'ansible_shell_executable' from source: unknown 10601 1726773078.33053: variable 'ansible_host' from source: host vars for 'managed_node2' 10601 1726773078.33057: variable 'ansible_pipelining' from source: unknown 10601 1726773078.33060: variable 'ansible_timeout' from source: unknown 10601 1726773078.33064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10601 1726773078.33195: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10601 1726773078.33206: variable 'omit' from source: magic vars 10601 1726773078.33212: starting attempt loop 10601 1726773078.33215: running the handler 10601 1726773078.33227: _low_level_execute_command(): starting 10601 1726773078.33235: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10601 1726773078.35565: stdout chunk (state=2): >>>/root <<< 10601 1726773078.35686: stderr chunk (state=3): >>><<< 10601 1726773078.35694: stdout chunk (state=3): >>><<< 10601 1726773078.35714: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10601 1726773078.35729: _low_level_execute_command(): starting 10601 1726773078.35735: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144 `" && echo ansible-tmp-1726773078.3572316-10601-50791067054144="` echo /root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144 `" ) && sleep 0' 10601 1726773078.38351: stdout chunk (state=2): >>>ansible-tmp-1726773078.3572316-10601-50791067054144=/root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144 <<< 10601 1726773078.38478: stderr chunk (state=3): >>><<< 10601 1726773078.38487: stdout chunk (state=3): >>><<< 10601 1726773078.38504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773078.3572316-10601-50791067054144=/root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144 , stderr= 10601 1726773078.38543: variable 'ansible_module_compression' from source: unknown 10601 1726773078.38577: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10601 1726773078.38610: variable 'ansible_facts' from source: unknown 10601 1726773078.38676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144/AnsiballZ_kernel_settings_get_config.py 10601 1726773078.38778: Sending initial data 10601 1726773078.38788: Sent initial data (173 bytes) 10601 1726773078.41593: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpyi3uxj81 /root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144/AnsiballZ_kernel_settings_get_config.py <<< 10601 1726773078.42851: stderr chunk (state=3): >>><<< 10601 1726773078.42859: stdout chunk (state=3): >>><<< 10601 1726773078.42876: done transferring module to remote 10601 1726773078.42888: _low_level_execute_command(): starting 10601 1726773078.42894: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144/ /root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10601 1726773078.45237: stderr chunk (state=2): >>><<< 10601 1726773078.45246: stdout chunk (state=2): >>><<< 10601 1726773078.45259: _low_level_execute_command() done: rc=0, stdout=, stderr= 10601 1726773078.45264: _low_level_execute_command(): starting 10601 1726773078.45271: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10601 1726773078.60647: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10601 1726773078.61691: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10601 1726773078.61739: stderr chunk (state=3): >>><<< 10601 1726773078.61745: stdout chunk (state=3): >>><<< 10601 1726773078.61763: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 10601 1726773078.61794: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10601 1726773078.61805: _low_level_execute_command(): starting 10601 1726773078.61811: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773078.3572316-10601-50791067054144/ > /dev/null 2>&1 && sleep 0' 10601 1726773078.64221: stderr chunk (state=2): >>><<< 10601 1726773078.64230: stdout chunk (state=2): >>><<< 10601 1726773078.64243: _low_level_execute_command() done: rc=0, stdout=, stderr= 10601 1726773078.64250: handler run complete 10601 1726773078.64265: attempt loop complete, returning result 10601 1726773078.64268: _execute() done 10601 1726773078.64272: dumping result to json 10601 1726773078.64276: done dumping result, returning 10601 1726773078.64284: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-885f-bbcf-000000000310] 10601 1726773078.64293: sending task result for task 0affffe7-6841-885f-bbcf-000000000310 10601 1726773078.64322: done sending task result for task 0affffe7-6841-885f-bbcf-000000000310 10601 1726773078.64325: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "400000", "vm.max_map_count": 
"65530" }, "sysfs": { "/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8240 1726773078.64481: no more pending results, returning what we have 8240 1726773078.64487: results queue empty 8240 1726773078.64488: checking for any_errors_fatal 8240 1726773078.64495: done checking for any_errors_fatal 8240 1726773078.64495: checking for max_fail_percentage 8240 1726773078.64497: done checking for max_fail_percentage 8240 1726773078.64497: checking to see if all hosts have failed and the running result is not ok 8240 1726773078.64498: done checking to see if all hosts have failed 8240 1726773078.64499: getting the remaining hosts for this loop 8240 1726773078.64500: done getting the remaining hosts for this loop 8240 1726773078.64503: getting the next task for host managed_node2 8240 1726773078.64510: done getting next task for host managed_node2 8240 1726773078.64513: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8240 1726773078.64516: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773078.64526: getting variables 8240 1726773078.64527: in VariableManager get_vars() 8240 1726773078.64560: Calling all_inventory to load vars for managed_node2 8240 1726773078.64563: Calling groups_inventory to load vars for managed_node2 8240 1726773078.64564: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773078.64576: Calling all_plugins_play to load vars for managed_node2 8240 1726773078.64578: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773078.64581: Calling groups_plugins_play to load vars for managed_node2 8240 1726773078.64731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773078.64848: done with get_vars() 8240 1726773078.64856: done getting variables 8240 1726773078.64904: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:18 -0400 (0:00:00.333) 0:00:57.293 **** 8240 1726773078.64927: entering _queue_task() for managed_node2/template 8240 1726773078.65089: worker is 1 (out of 1 available) 8240 1726773078.65102: exiting _queue_task() for managed_node2/template 8240 1726773078.65115: done queuing things up, now waiting for results queue to drain 8240 1726773078.65116: waiting for pending results... 
10617 1726773078.65237: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 10617 1726773078.65349: in run() - task 0affffe7-6841-885f-bbcf-000000000311 10617 1726773078.65364: variable 'ansible_search_path' from source: unknown 10617 1726773078.65368: variable 'ansible_search_path' from source: unknown 10617 1726773078.65397: calling self._execute() 10617 1726773078.65462: variable 'ansible_host' from source: host vars for 'managed_node2' 10617 1726773078.65471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10617 1726773078.65479: variable 'omit' from source: magic vars 10617 1726773078.65555: variable 'omit' from source: magic vars 10617 1726773078.65590: variable 'omit' from source: magic vars 10617 1726773078.65818: variable '__kernel_settings_profile_src' from source: role '' all vars 10617 1726773078.65828: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10617 1726773078.65883: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10617 1726773078.65905: variable '__kernel_settings_profile_filename' from source: role '' all vars 10617 1726773078.65951: variable '__kernel_settings_profile_filename' from source: role '' all vars 10617 1726773078.66004: variable '__kernel_settings_profile_dir' from source: role '' all vars 10617 1726773078.66060: variable '__kernel_settings_profile_parent' from source: set_fact 10617 1726773078.66069: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10617 1726773078.66100: variable 'omit' from source: magic vars 10617 1726773078.66131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10617 1726773078.66156: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10617 1726773078.66173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10617 1726773078.66191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10617 1726773078.66203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10617 1726773078.66227: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10617 1726773078.66233: variable 'ansible_host' from source: host vars for 'managed_node2' 10617 1726773078.66237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10617 1726773078.66306: Set connection var ansible_pipelining to False 10617 1726773078.66314: Set connection var ansible_timeout to 10 10617 1726773078.66321: Set connection var ansible_module_compression to ZIP_DEFLATED 10617 1726773078.66324: Set connection var ansible_shell_type to sh 10617 1726773078.66327: Set connection var ansible_shell_executable to /bin/sh 10617 1726773078.66329: Set connection var ansible_connection to ssh 10617 1726773078.66342: variable 'ansible_shell_executable' from source: unknown 10617 1726773078.66345: variable 'ansible_connection' from source: unknown 10617 1726773078.66347: variable 'ansible_module_compression' from source: unknown 10617 1726773078.66349: variable 'ansible_shell_type' from source: unknown 10617 1726773078.66350: variable 'ansible_shell_executable' from source: unknown 10617 1726773078.66352: variable 'ansible_host' from source: host vars for 'managed_node2' 10617 1726773078.66354: 
variable 'ansible_pipelining' from source: unknown 10617 1726773078.66355: variable 'ansible_timeout' from source: unknown 10617 1726773078.66358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10617 1726773078.66447: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10617 1726773078.66457: variable 'omit' from source: magic vars 10617 1726773078.66461: starting attempt loop 10617 1726773078.66463: running the handler 10617 1726773078.66471: _low_level_execute_command(): starting 10617 1726773078.66477: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10617 1726773078.68751: stdout chunk (state=2): >>>/root <<< 10617 1726773078.68863: stderr chunk (state=3): >>><<< 10617 1726773078.68869: stdout chunk (state=3): >>><<< 10617 1726773078.68891: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10617 1726773078.68903: _low_level_execute_command(): starting 10617 1726773078.68907: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077 `" && echo ansible-tmp-1726773078.6889782-10617-112521471970077="` echo /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077 `" ) && sleep 0' 10617 1726773078.71790: stdout chunk (state=2): >>>ansible-tmp-1726773078.6889782-10617-112521471970077=/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077 <<< 10617 1726773078.71801: stderr chunk (state=2): >>><<< 10617 1726773078.71811: stdout chunk (state=3): >>><<< 10617 1726773078.71835: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773078.6889782-10617-112521471970077=/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077 , stderr= 10617 1726773078.71858: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 10617 1726773078.71882: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 10617 1726773078.71908: variable 'ansible_search_path' from source: unknown 10617 1726773078.72503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10617 1726773078.73996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10617 1726773078.74051: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10617 1726773078.74081: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10617 1726773078.74115: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10617 1726773078.74145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10617 1726773078.74432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10617 1726773078.74449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10617 1726773078.74470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10617 1726773078.74512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10617 1726773078.74527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10617 1726773078.74944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10617 1726773078.74970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10617 1726773078.75001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10617 1726773078.75042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10617 1726773078.75062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10617 1726773078.75409: variable 'ansible_managed' from source: unknown 10617 1726773078.75416: variable '__sections' from source: task vars 10617 1726773078.75501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10617 1726773078.75522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10617 1726773078.75550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10617 1726773078.75590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10617 1726773078.75603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10617 1726773078.75700: variable 'kernel_settings_sysctl' from source: include params 10617 1726773078.75711: variable '__kernel_settings_state_empty' from source: role '' all vars 10617 1726773078.75717: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10617 1726773078.75768: variable '__sysctl_old' from source: task vars 10617 1726773078.75833: variable '__sysctl_old' from source: task vars 10617 1726773078.76061: variable 'kernel_settings_purge' from source: role '' defaults 10617 1726773078.76069: variable 'kernel_settings_sysctl' from source: include params 10617 1726773078.76078: variable '__kernel_settings_state_empty' from source: role '' all vars 10617 1726773078.76083: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10617 1726773078.76089: variable '__kernel_settings_profile_contents' from source: set_fact 10617 1726773078.76298: variable 'kernel_settings_sysfs' from source: role '' defaults 10617 1726773078.76306: variable '__kernel_settings_state_empty' from source: role '' all vars 10617 1726773078.76312: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10617 1726773078.76327: variable '__sysfs_old' from source: task vars 10617 1726773078.76393: variable '__sysfs_old' from source: task vars 10617 1726773078.76541: variable 'kernel_settings_purge' from source: role '' defaults 10617 1726773078.76549: variable 'kernel_settings_sysfs' from source: role '' defaults 10617 1726773078.76554: variable '__kernel_settings_state_empty' from source: role '' all vars 10617 1726773078.76560: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10617 1726773078.76565: variable '__kernel_settings_profile_contents' from source: set_fact 10617 1726773078.76608: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 10617 1726773078.76617: variable '__systemd_old' from source: task vars 10617 1726773078.76657: variable '__systemd_old' from source: task vars 10617 1726773078.76793: variable 'kernel_settings_purge' from source: role '' defaults 10617 1726773078.76800: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 10617 1726773078.76805: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.76811: variable '__kernel_settings_profile_contents' from source: set_fact 10617 1726773078.76823: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 10617 1726773078.76828: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 10617 1726773078.76833: variable '__trans_huge_old' from source: task vars 10617 1726773078.76874: variable '__trans_huge_old' from source: task vars 10617 1726773078.77005: variable 'kernel_settings_purge' from source: role '' defaults 10617 1726773078.77011: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 10617 1726773078.77017: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77022: variable '__kernel_settings_profile_contents' from source: set_fact 10617 1726773078.77033: variable '__trans_defrag_old' from source: task vars 10617 
1726773078.77075: variable '__trans_defrag_old' from source: task vars 10617 1726773078.77206: variable 'kernel_settings_purge' from source: role '' defaults 10617 1726773078.77213: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 10617 1726773078.77217: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77223: variable '__kernel_settings_profile_contents' from source: set_fact 10617 1726773078.77238: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77247: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77262: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77270: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77278: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77294: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77301: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77306: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77311: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77319: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77325: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77331: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77337: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77343: variable '__kernel_settings_state_absent' from source: role '' all vars 10617 1726773078.77793: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10617 1726773078.77834: variable 'ansible_module_compression' from source: unknown 10617 1726773078.77876: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10617 1726773078.77898: variable 'ansible_facts' from source: unknown 10617 1726773078.77963: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/AnsiballZ_stat.py 10617 1726773078.78053: Sending initial data 10617 1726773078.78061: Sent initial data (152 bytes) 10617 1726773078.80791: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpwknuzhk7 /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/AnsiballZ_stat.py <<< 10617 1726773078.81909: stderr chunk (state=3): >>><<< 10617 1726773078.81923: stdout chunk (state=3): >>><<< 10617 1726773078.81944: done transferring module to remote 10617 1726773078.81956: _low_level_execute_command(): starting 10617 1726773078.81962: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/ /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/AnsiballZ_stat.py && sleep 0' 10617 1726773078.84346: stderr chunk (state=2): >>><<< 10617 1726773078.84355: stdout chunk (state=2): >>><<< 10617 1726773078.84370: _low_level_execute_command() done: rc=0, 
stdout=, stderr= 10617 1726773078.84378: _low_level_execute_command(): starting 10617 1726773078.84383: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/AnsiballZ_stat.py && sleep 0' 10617 1726773079.00995: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 8388806, "dev": 51713, "nlink": 1, "atime": 1726773042.2231216, "mtime": 1726773041.2131114, "ctime": 1726773041.5001142, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "mimetype": "text/plain", "charset": "us-ascii", "version": "1062638163", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10617 1726773079.02001: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10617 1726773079.02011: stdout chunk (state=3): >>><<< 10617 1726773079.02021: stderr chunk (state=3): >>><<< 10617 1726773079.02032: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 8388806, "dev": 51713, "nlink": 1, "atime": 1726773042.2231216, "mtime": 1726773041.2131114, "ctime": 1726773041.5001142, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3feaf86b2638623e3300792e683ce55f91f31e9a", "mimetype": "text/plain", "charset": "us-ascii", "version": "1062638163", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
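
The trace above is the first phase of the template action for the "Apply kernel settings" task: the controller renders kernel_settings.j2 locally, creates a remote temp directory, then runs ansible.legacy.stat against the destination to obtain its SHA-1 checksum before deciding whether an upload is needed. A minimal task of this shape (a hypothetical sketch based only on the src, dest, and mode values visible in the trace, not the role's actual tasks file) would be:

    - name: Apply kernel settings
      ansible.builtin.template:
        src: kernel_settings.j2
        dest: /etc/tuned/kernel_settings/tuned.conf
        mode: "0644"
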
10617 1726773079.02070: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10617 1726773079.02399: Sending initial data 10617 1726773079.02406: Sent initial data (160 bytes) 10617 1726773079.04709: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmproczjcda/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/source <<< 10617 1726773079.05075: stderr chunk (state=3): >>><<< 10617 1726773079.05082: stdout chunk (state=3): >>><<< 10617 1726773079.05099: _low_level_execute_command(): starting 10617 1726773079.05104: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/ /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/source && sleep 0' 10617 1726773079.07417: stderr chunk (state=2): >>><<< 10617 1726773079.07426: stdout chunk (state=2): >>><<< 10617 1726773079.07440: _low_level_execute_command() done: rc=0, stdout=, stderr= 10617 1726773079.07460: variable 'ansible_module_compression' from source: unknown 10617 1726773079.07499: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 10617 1726773079.07517: variable 'ansible_facts' from source: unknown 10617 1726773079.07575: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/AnsiballZ_copy.py 10617 1726773079.07658: Sending initial data 10617 1726773079.07665: Sent initial data (152 bytes) 10617 1726773079.10287: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpysvij6qd /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/AnsiballZ_copy.py <<< 10617 1726773079.11523: stderr chunk (state=3): >>><<< 10617 1726773079.11532: stdout chunk (state=3): >>><<< 10617 1726773079.11553: done transferring module to remote 10617 1726773079.11563: _low_level_execute_command(): starting 10617 1726773079.11568: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/ /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/AnsiballZ_copy.py && sleep 0' 10617 1726773079.13921: stderr chunk (state=2): >>><<< 10617 1726773079.13932: stdout chunk (state=2): >>><<< 10617 1726773079.13948: _low_level_execute_command() done: rc=0, stdout=, stderr= 10617 1726773079.13953: _low_level_execute_command(): starting 10617 1726773079.13958: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/AnsiballZ_copy.py && sleep 0' 10617 1726773079.30653: stdout chunk (state=2): >>> {"dest": 
"/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/source", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10617 1726773079.31815: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10617 1726773079.31858: stderr chunk (state=3): >>><<< 10617 1726773079.31864: stdout chunk (state=3): >>><<< 10617 1726773079.31882: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/source", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
10617 1726773079.31910: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '221aa34fef95c2fe05408be9921820449785a5b2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10617 1726773079.31938: _low_level_execute_command(): starting 10617 1726773079.31946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/ > /dev/null 2>&1 && sleep 0' 10617 1726773079.34563: stderr chunk (state=2): >>><<< 10617 1726773079.34578: stdout chunk (state=2): >>><<< 10617 1726773079.34598: _low_level_execute_command() done: rc=0, stdout=, stderr= 10617 1726773079.34610: handler run complete 10617 1726773079.34638: attempt loop complete, returning result 10617 1726773079.34644: _execute() done 10617 1726773079.34647: dumping result to json 10617 1726773079.34653: done dumping result, returning 10617 1726773079.34662: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-885f-bbcf-000000000311] 10617 1726773079.34668: sending task result for task 0affffe7-6841-885f-bbcf-000000000311 10617 1726773079.34735: done sending task result for task 0affffe7-6841-885f-bbcf-000000000311 10617 1726773079.34739: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "9c36ebcc135366fa59ab6f2f2da76a73", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 351, "src": "/root/.ansible/tmp/ansible-tmp-1726773078.6889782-10617-112521471970077/source", "state": "file", "uid": 0 } 8240 1726773079.35233: no more pending results, returning what we have 8240 1726773079.35236: results queue empty 8240 1726773079.35237: checking for any_errors_fatal 8240 1726773079.35243: done checking for any_errors_fatal 8240 1726773079.35244: checking for max_fail_percentage 8240 1726773079.35246: done checking for max_fail_percentage 8240 1726773079.35247: checking to see if all hosts have failed and the running result is not ok 8240 1726773079.35247: done checking to see if all hosts have failed 8240 1726773079.35248: getting the remaining hosts for this loop 8240 1726773079.35249: done getting the remaining hosts for this loop 8240 1726773079.35253: getting the next task for host managed_node2 8240 1726773079.35259: done getting next task for host managed_node2 8240 1726773079.35262: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8240 1726773079.35265: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773079.35279: getting variables 8240 1726773079.35281: in VariableManager get_vars() 8240 1726773079.35318: Calling all_inventory to load vars for managed_node2 8240 1726773079.35321: Calling groups_inventory to load vars for managed_node2 8240 1726773079.35324: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773079.35335: Calling all_plugins_play to load vars for managed_node2 8240 1726773079.35338: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773079.35342: Calling groups_plugins_play to load vars for managed_node2 8240 1726773079.35520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773079.35729: done with get_vars() 8240 1726773079.35741: done getting variables 8240 1726773079.35808: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:19 -0400 (0:00:00.709) 0:00:58.002 **** 8240 1726773079.35841: entering _queue_task() for managed_node2/service 8240 1726773079.36034: worker is 1 (out of 1 available) 8240 1726773079.36053: exiting _queue_task() for managed_node2/service 8240 1726773079.36068: done queuing things up, now waiting for results queue to drain 8240 1726773079.36070: waiting for pending results... 
10650 1726773079.36192: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 10650 1726773079.36309: in run() - task 0affffe7-6841-885f-bbcf-000000000312 10650 1726773079.36326: variable 'ansible_search_path' from source: unknown 10650 1726773079.36330: variable 'ansible_search_path' from source: unknown 10650 1726773079.36365: variable '__kernel_settings_services' from source: include_vars 10650 1726773079.36597: variable '__kernel_settings_services' from source: include_vars 10650 1726773079.36727: variable 'omit' from source: magic vars 10650 1726773079.36805: variable 'ansible_host' from source: host vars for 'managed_node2' 10650 1726773079.36815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10650 1726773079.36824: variable 'omit' from source: magic vars 10650 1726773079.37000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10650 1726773079.37164: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10650 1726773079.37201: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10650 1726773079.37226: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10650 1726773079.37251: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10650 1726773079.37324: variable '__kernel_settings_register_profile' from source: set_fact 10650 1726773079.37335: variable '__kernel_settings_register_mode' from source: set_fact 10650 1726773079.37352: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 10650 1726773079.37356: when evaluation is False, skipping this task 10650 1726773079.37378: variable 'item' from source: unknown 10650 1726773079.37425: variable 'item' from source: unknown skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 10650 1726773079.37453: dumping result to json 10650 1726773079.37459: done dumping result, returning 10650 1726773079.37464: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-885f-bbcf-000000000312] 10650 1726773079.37470: sending task result for task 0affffe7-6841-885f-bbcf-000000000312 10650 1726773079.37494: done sending task result for task 0affffe7-6841-885f-bbcf-000000000312 10650 1726773079.37498: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 8240 1726773079.37666: no more pending results, returning what we have 8240 1726773079.37669: results queue empty 8240 1726773079.37670: checking for any_errors_fatal 8240 1726773079.37682: done checking for any_errors_fatal 8240 1726773079.37683: checking for max_fail_percentage 8240 1726773079.37684: done checking for max_fail_percentage 8240 1726773079.37687: checking to see if all hosts have failed and the running result is not ok 8240 1726773079.37688: done checking to see if all hosts have failed 8240 1726773079.37688: getting the remaining hosts for this loop 8240 1726773079.37689: done getting the remaining 
hosts for this loop 8240 1726773079.37692: getting the next task for host managed_node2 8240 1726773079.37702: done getting next task for host managed_node2 8240 1726773079.37705: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8240 1726773079.37708: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773079.37722: getting variables 8240 1726773079.37724: in VariableManager get_vars() 8240 1726773079.37762: Calling all_inventory to load vars for managed_node2 8240 1726773079.37765: Calling groups_inventory to load vars for managed_node2 8240 1726773079.37767: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773079.37779: Calling all_plugins_play to load vars for managed_node2 8240 1726773079.37782: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773079.37786: Calling groups_plugins_play to load vars for managed_node2 8240 1726773079.37942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773079.38148: done with get_vars() 8240 1726773079.38159: done getting variables 8240 1726773079.38219: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:19 -0400 (0:00:00.024) 0:00:58.026 **** 8240 1726773079.38247: entering _queue_task() for managed_node2/command 8240 1726773079.38446: worker is 1 (out of 1 available) 8240 1726773079.38463: exiting _queue_task() for managed_node2/command 8240 1726773079.38480: done queuing things up, now waiting for results queue to drain 8240 1726773079.38481: waiting for pending results... 
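
The restart task above was skipped because its when: condition (shown as false_condition in the skip result) evaluated to False for the single loop item tuned. A hedged sketch of a task that produces exactly this skip pattern, using the loop and condition names visible in the trace:

    - name: Restart tuned to apply active profile, mode changes
      ansible.builtin.service:
        name: "{{ item }}"
        state: restarted
      loop: "{{ __kernel_settings_services }}"
      when: __kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed
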
10652 1726773079.38652: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 10652 1726773079.38766: in run() - task 0affffe7-6841-885f-bbcf-000000000313 10652 1726773079.38782: variable 'ansible_search_path' from source: unknown 10652 1726773079.38789: variable 'ansible_search_path' from source: unknown 10652 1726773079.38815: calling self._execute() 10652 1726773079.38880: variable 'ansible_host' from source: host vars for 'managed_node2' 10652 1726773079.38889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10652 1726773079.38895: variable 'omit' from source: magic vars 10652 1726773079.39218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10652 1726773079.39474: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10652 1726773079.39509: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10652 1726773079.39534: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10652 1726773079.39560: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10652 1726773079.39638: variable '__kernel_settings_register_profile' from source: set_fact 10652 1726773079.39660: Evaluated conditional (not __kernel_settings_register_profile is changed): True 10652 1726773079.39749: variable '__kernel_settings_register_mode' from source: set_fact 10652 1726773079.39761: Evaluated conditional (not __kernel_settings_register_mode is changed): True 10652 1726773079.39837: variable '__kernel_settings_register_apply' from source: set_fact 10652 1726773079.39845: Evaluated conditional (__kernel_settings_register_apply is changed): True 10652 1726773079.39850: variable 'omit' from source: magic vars 10652 1726773079.39878: variable 'omit' from source: magic vars 10652 1726773079.39959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10652 1726773079.41622: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10652 1726773079.41700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10652 1726773079.41735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10652 1726773079.41765: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10652 1726773079.41792: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10652 1726773079.41865: variable '__kernel_settings_active_profile' from source: set_fact 10652 1726773079.41900: variable 'omit' from source: magic vars 10652 1726773079.41928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10652 1726773079.41954: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10652 1726773079.41974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10652 1726773079.41992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10652 1726773079.42003: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10652 1726773079.42033: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10652 1726773079.42039: variable 'ansible_host' from source: host vars for 'managed_node2' 10652 1726773079.42044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10652 1726773079.42143: Set connection var ansible_pipelining to False 10652 1726773079.42152: Set connection var ansible_timeout to 10 10652 1726773079.42160: Set connection var ansible_module_compression to ZIP_DEFLATED 10652 1726773079.42163: Set connection var ansible_shell_type to sh 10652 1726773079.42168: Set connection var ansible_shell_executable to /bin/sh 10652 1726773079.42173: Set connection var ansible_connection to ssh 10652 1726773079.42196: variable 'ansible_shell_executable' from source: unknown 10652 1726773079.42201: variable 'ansible_connection' from source: unknown 10652 1726773079.42205: variable 'ansible_module_compression' from source: unknown 10652 1726773079.42208: variable 'ansible_shell_type' from source: unknown 10652 1726773079.42210: variable 'ansible_shell_executable' from source: unknown 10652 1726773079.42213: variable 'ansible_host' from source: host vars for 'managed_node2' 10652 1726773079.42217: variable 'ansible_pipelining' from source: unknown 10652 1726773079.42219: variable 'ansible_timeout' from source: unknown 10652 1726773079.42223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10652 1726773079.42317: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10652 1726773079.42328: variable 'omit' from source: magic vars 10652 1726773079.42334: starting attempt loop 10652 1726773079.42338: running the handler 10652 1726773079.42351: _low_level_execute_command(): starting 10652 1726773079.42358: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10652 1726773079.45092: stdout chunk (state=2): >>>/root <<< 10652 1726773079.45104: stderr chunk (state=2): >>><<< 10652 1726773079.45118: stdout chunk (state=3): >>><<< 10652 1726773079.45134: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10652 1726773079.45150: _low_level_execute_command(): starting 10652 1726773079.45157: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895 `" && echo ansible-tmp-1726773079.4514234-10652-140599872075895="` echo /root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895 `" ) && sleep 0' 10652 1726773079.48108: stdout chunk (state=2): >>>ansible-tmp-1726773079.4514234-10652-140599872075895=/root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895 <<< 10652 1726773079.48198: stderr chunk (state=3): >>><<< 10652 1726773079.48205: stdout chunk (state=3): >>><<< 10652 1726773079.48221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773079.4514234-10652-140599872075895=/root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895 , stderr= 10652 1726773079.48247: variable 'ansible_module_compression' from source: unknown 10652 1726773079.48292: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10652 1726773079.48321: variable 'ansible_facts' from source: unknown 10652 1726773079.48401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895/AnsiballZ_command.py 10652 1726773079.48500: Sending initial data 10652 1726773079.48507: Sent initial data (155 bytes) 10652 1726773079.50974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpt534caie /root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895/AnsiballZ_command.py <<< 10652 1726773079.52438: stderr chunk (state=3): >>><<< 10652 1726773079.52446: stdout chunk (state=3): >>><<< 10652 1726773079.52462: done transferring module to remote 10652 1726773079.52470: _low_level_execute_command(): starting 10652 1726773079.52474: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895/ /root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895/AnsiballZ_command.py && sleep 0' 10652 1726773079.54837: stderr chunk (state=2): >>><<< 10652 1726773079.54847: stdout chunk (state=2): >>><<< 10652 1726773079.54861: _low_level_execute_command() done: rc=0, stdout=, stderr= 10652 1726773079.54865: _low_level_execute_command(): starting 10652 1726773079.54871: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895/AnsiballZ_command.py && sleep 0' 10652 1726773080.87721: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:19.698309", "end": "2024-09-19 15:11:20.875093", "delta": "0:00:01.176784", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10652 1726773080.89016: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10652 1726773080.89067: stderr chunk (state=3): >>><<< 10652 1726773080.89079: stdout chunk (state=3): >>><<< 10652 1726773080.89099: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:19.698309", "end": "2024-09-19 15:11:20.875093", "delta": "0:00:01.176784", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
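
This is the step that actually activates the profile: the command module ran tuned-adm profile 'virtual-guest kernel_settings' on the remote host in about 1.2 s. The evaluated conditionals show it runs only when the active profile and mode were already correct but the rendered settings changed. A sketch of such a task (the exact wording of the role's main.yml:157 is not reproduced here; the command string is built from __kernel_settings_active_profile, which the trace reads from set_fact just before execution):

    - name: Tuned apply settings
      ansible.builtin.command: >-
        tuned-adm profile '{{ __kernel_settings_active_profile }}'
      when:
        - not __kernel_settings_register_profile is changed
        - not __kernel_settings_register_mode is changed
        - __kernel_settings_register_apply is changed
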
10652 1726773080.89136: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10652 1726773080.89145: _low_level_execute_command(): starting 10652 1726773080.89152: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773079.4514234-10652-140599872075895/ > /dev/null 2>&1 && sleep 0' 10652 1726773080.91955: stderr chunk (state=2): >>><<< 10652 1726773080.91966: stdout chunk (state=2): >>><<< 10652 1726773080.91981: _low_level_execute_command() done: rc=0, stdout=, stderr= 10652 1726773080.91990: handler run complete 10652 1726773080.92008: Evaluated conditional (True): True 10652 1726773080.92017: attempt loop complete, returning result 10652 1726773080.92021: _execute() done 10652 1726773080.92025: dumping result to json 10652 1726773080.92031: done dumping result, returning 10652 1726773080.92038: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-885f-bbcf-000000000313] 10652 1726773080.92044: sending task result for task 0affffe7-6841-885f-bbcf-000000000313 10652 1726773080.92072: done sending task result for task 0affffe7-6841-885f-bbcf-000000000313 10652 1726773080.92076: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.176784", "end": "2024-09-19 15:11:20.875093", "rc": 0, "start": "2024-09-19 15:11:19.698309" } 8240 1726773080.92262: no more pending results, returning what we have 8240 1726773080.92266: results queue empty 8240 1726773080.92266: checking for any_errors_fatal 8240 1726773080.92274: done checking for any_errors_fatal 8240 1726773080.92274: checking for max_fail_percentage 8240 1726773080.92276: done checking for max_fail_percentage 8240 1726773080.92276: checking to see if all hosts have failed and the running result is not ok 8240 1726773080.92277: done checking to see if all hosts have failed 8240 1726773080.92278: getting the remaining hosts for this loop 8240 1726773080.92279: done getting the remaining hosts for this loop 8240 1726773080.92283: getting the next task for host managed_node2 8240 1726773080.92294: done getting next task for host managed_node2 8240 1726773080.92297: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8240 1726773080.92300: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 8240 1726773080.92311: getting variables 8240 1726773080.92312: in VariableManager get_vars() 8240 1726773080.92344: Calling all_inventory to load vars for managed_node2 8240 1726773080.92347: Calling groups_inventory to load vars for managed_node2 8240 1726773080.92348: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773080.92356: Calling all_plugins_play to load vars for managed_node2 8240 1726773080.92358: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773080.92360: Calling groups_plugins_play to load vars for managed_node2 8240 1726773080.92473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773080.92663: done with get_vars() 8240 1726773080.92671: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:20 -0400 (0:00:01.544) 0:00:59.571 **** 8240 1726773080.92745: entering _queue_task() for managed_node2/include_tasks 8240 1726773080.92945: worker is 1 (out of 1 available) 8240 1726773080.92959: exiting _queue_task() for managed_node2/include_tasks 8240 1726773080.92972: done queuing things up, now waiting for results queue to drain 8240 1726773080.92973: waiting for pending results... 10702 1726773080.93215: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 10702 1726773080.93369: in run() - task 0affffe7-6841-885f-bbcf-000000000314 10702 1726773080.93392: variable 'ansible_search_path' from source: unknown 10702 1726773080.93396: variable 'ansible_search_path' from source: unknown 10702 1726773080.93428: calling self._execute() 10702 1726773080.93513: variable 'ansible_host' from source: host vars for 'managed_node2' 10702 1726773080.93523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10702 1726773080.93532: variable 'omit' from source: magic vars 10702 1726773080.93937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10702 1726773080.94131: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10702 1726773080.94168: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10702 1726773080.94202: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10702 1726773080.94229: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10702 1726773080.94314: variable '__kernel_settings_register_apply' from source: set_fact 10702 1726773080.94337: Evaluated conditional (__kernel_settings_register_apply is changed): True 10702 1726773080.94344: _execute() done 10702 1726773080.94349: dumping result to json 10702 1726773080.94353: done dumping result, returning 10702 1726773080.94360: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-885f-bbcf-000000000314] 10702 1726773080.94366: sending task result for task 0affffe7-6841-885f-bbcf-000000000314 10702 1726773080.94395: done sending task result for task 0affffe7-6841-885f-bbcf-000000000314 10702 1726773080.94398: WORKER PROCESS EXITING 8240 
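
Because the apply step reported changed, the role now includes its verification tasks from verify_settings.yml; the include itself is gated on that registered result. Roughly (a sketch, mirroring the conditional evaluated in the trace):

    - name: Verify settings
      ansible.builtin.include_tasks: verify_settings.yml
      when: __kernel_settings_register_apply is changed
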
1726773080.94511: no more pending results, returning what we have 8240 1726773080.94516: in VariableManager get_vars() 8240 1726773080.94555: Calling all_inventory to load vars for managed_node2 8240 1726773080.94557: Calling groups_inventory to load vars for managed_node2 8240 1726773080.94559: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773080.94569: Calling all_plugins_play to load vars for managed_node2 8240 1726773080.94572: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773080.94574: Calling groups_plugins_play to load vars for managed_node2 8240 1726773080.94703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773080.94821: done with get_vars() 8240 1726773080.94827: variable 'ansible_search_path' from source: unknown 8240 1726773080.94827: variable 'ansible_search_path' from source: unknown 8240 1726773080.94851: we have included files to process 8240 1726773080.94851: generating all_blocks data 8240 1726773080.94854: done generating all_blocks data 8240 1726773080.94858: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773080.94858: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773080.94860: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8240 1726773080.95126: done processing included file 8240 1726773080.95128: iterating over new_blocks loaded from include file 8240 1726773080.95129: in VariableManager get_vars() 8240 1726773080.95145: done with get_vars() 8240 1726773080.95146: filtering new block on tags 8240 1726773080.95178: done filtering new block on tags 8240 1726773080.95180: done iterating over new_blocks loaded from include file 8240 1726773080.95180: extending task lists for all hosts with included blocks 8240 1726773080.95590: done extending task lists 8240 1726773080.95591: done processing included files 8240 1726773080.95592: results queue empty 8240 1726773080.95592: checking for any_errors_fatal 8240 1726773080.95596: done checking for any_errors_fatal 8240 1726773080.95596: checking for max_fail_percentage 8240 1726773080.95597: done checking for max_fail_percentage 8240 1726773080.95597: checking to see if all hosts have failed and the running result is not ok 8240 1726773080.95598: done checking to see if all hosts have failed 8240 1726773080.95598: getting the remaining hosts for this loop 8240 1726773080.95599: done getting the remaining hosts for this loop 8240 1726773080.95601: getting the next task for host managed_node2 8240 1726773080.95604: done getting next task for host managed_node2 8240 1726773080.95605: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8240 1726773080.95607: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773080.95614: getting variables 8240 1726773080.95614: in VariableManager get_vars() 8240 1726773080.95623: Calling all_inventory to load vars for managed_node2 8240 1726773080.95624: Calling groups_inventory to load vars for managed_node2 8240 1726773080.95625: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773080.95629: Calling all_plugins_play to load vars for managed_node2 8240 1726773080.95630: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773080.95631: Calling groups_plugins_play to load vars for managed_node2 8240 1726773080.95711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773080.95819: done with get_vars() 8240 1726773080.95826: done getting variables 8240 1726773080.95851: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:20 -0400 (0:00:00.031) 0:00:59.602 **** 8240 1726773080.95873: entering _queue_task() for managed_node2/command 8240 1726773080.96054: worker is 1 (out of 1 available) 8240 1726773080.96068: exiting _queue_task() for managed_node2/command 8240 1726773080.96080: done queuing things up, now waiting for results queue to drain 8240 1726773080.96082: waiting for pending results... 
10704 1726773080.96217: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 10704 1726773080.96347: in run() - task 0affffe7-6841-885f-bbcf-000000000483 10704 1726773080.96364: variable 'ansible_search_path' from source: unknown 10704 1726773080.96368: variable 'ansible_search_path' from source: unknown 10704 1726773080.96401: calling self._execute() 10704 1726773080.96470: variable 'ansible_host' from source: host vars for 'managed_node2' 10704 1726773080.96479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10704 1726773080.96491: variable 'omit' from source: magic vars 10704 1726773080.96564: variable 'omit' from source: magic vars 10704 1726773080.96612: variable 'omit' from source: magic vars 10704 1726773080.96636: variable 'omit' from source: magic vars 10704 1726773080.96671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10704 1726773080.96704: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10704 1726773080.96724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10704 1726773080.96740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10704 1726773080.96751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10704 1726773080.96776: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10704 1726773080.96782: variable 'ansible_host' from source: host vars for 'managed_node2' 10704 1726773080.96788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10704 1726773080.96862: Set connection var ansible_pipelining to False 10704 1726773080.96869: Set connection var ansible_timeout to 10 10704 1726773080.96877: Set connection var ansible_module_compression to ZIP_DEFLATED 10704 1726773080.96881: Set connection var ansible_shell_type to sh 10704 1726773080.96888: Set connection var ansible_shell_executable to /bin/sh 10704 1726773080.96895: Set connection var ansible_connection to ssh 10704 1726773080.96911: variable 'ansible_shell_executable' from source: unknown 10704 1726773080.96915: variable 'ansible_connection' from source: unknown 10704 1726773080.96920: variable 'ansible_module_compression' from source: unknown 10704 1726773080.96923: variable 'ansible_shell_type' from source: unknown 10704 1726773080.96927: variable 'ansible_shell_executable' from source: unknown 10704 1726773080.96930: variable 'ansible_host' from source: host vars for 'managed_node2' 10704 1726773080.96934: variable 'ansible_pipelining' from source: unknown 10704 1726773080.96937: variable 'ansible_timeout' from source: unknown 10704 1726773080.96941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10704 1726773080.97040: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10704 1726773080.97054: variable 'omit' from source: magic vars 10704 1726773080.97060: starting attempt loop 10704 1726773080.97063: running the handler 10704 1726773080.97076: 
_low_level_execute_command(): starting 10704 1726773080.97082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10704 1726773080.99443: stdout chunk (state=2): >>>/root <<< 10704 1726773080.99562: stderr chunk (state=3): >>><<< 10704 1726773080.99570: stdout chunk (state=3): >>><<< 10704 1726773080.99594: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10704 1726773080.99610: _low_level_execute_command(): starting 10704 1726773080.99616: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182 `" && echo ansible-tmp-1726773080.9960296-10704-13206854958182="` echo /root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182 `" ) && sleep 0' 10704 1726773081.02187: stdout chunk (state=2): >>>ansible-tmp-1726773080.9960296-10704-13206854958182=/root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182 <<< 10704 1726773081.02319: stderr chunk (state=3): >>><<< 10704 1726773081.02327: stdout chunk (state=3): >>><<< 10704 1726773081.02345: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773080.9960296-10704-13206854958182=/root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182 , stderr= 10704 1726773081.02374: variable 'ansible_module_compression' from source: unknown 10704 1726773081.02423: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10704 1726773081.02456: variable 'ansible_facts' from source: unknown 10704 1726773081.02534: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182/AnsiballZ_command.py 10704 1726773081.02644: Sending initial data 10704 1726773081.02651: Sent initial data (154 bytes) 10704 1726773081.05238: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmputij1mad /root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182/AnsiballZ_command.py <<< 10704 1726773081.06358: stderr chunk (state=3): >>><<< 10704 1726773081.06369: stdout chunk (state=3): >>><<< 10704 1726773081.06392: done transferring module to remote 10704 1726773081.06405: _low_level_execute_command(): starting 10704 1726773081.06411: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182/ /root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182/AnsiballZ_command.py && sleep 0' 10704 1726773081.08857: stderr chunk (state=2): >>><<< 10704 1726773081.08870: stdout chunk (state=2): >>><<< 10704 1726773081.08891: _low_level_execute_command() done: rc=0, stdout=, stderr= 10704 1726773081.08897: _low_level_execute_command(): starting 10704 1726773081.08902: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182/AnsiballZ_command.py && sleep 0' 10704 1726773081.34729: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:21.240235", "end": "2024-09-19 15:11:21.345192", "delta": "0:00:00.104957", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10704 1726773081.35943: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10704 1726773081.35995: stderr chunk (state=3): >>><<< 10704 1726773081.36002: stdout chunk (state=3): >>><<< 10704 1726773081.36019: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:21.240235", "end": "2024-09-19 15:11:21.345192", "delta": "0:00:00.104957", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 10704 1726773081.36061: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10704 1726773081.36072: _low_level_execute_command(): starting 10704 1726773081.36080: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773080.9960296-10704-13206854958182/ > /dev/null 2>&1 && sleep 0' 10704 1726773081.38516: stderr chunk (state=2): >>><<< 10704 1726773081.38526: stdout chunk (state=2): >>><<< 10704 1726773081.38544: _low_level_execute_command() done: rc=0, stdout=, stderr= 10704 1726773081.38552: handler run complete 10704 1726773081.38571: Evaluated conditional (False): False 10704 1726773081.38580: attempt loop complete, returning result 10704 1726773081.38584: _execute() done 10704 1726773081.38588: dumping result to json 10704 1726773081.38596: done dumping result, returning 10704 1726773081.38605: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-885f-bbcf-000000000483] 10704 1726773081.38613: sending task result for task 0affffe7-6841-885f-bbcf-000000000483 10704 1726773081.38643: done sending task result for task 0affffe7-6841-885f-bbcf-000000000483 10704 1726773081.38647: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.104957", "end": "2024-09-19 15:11:21.345192", "rc": 0, "start": "2024-09-19 15:11:21.240235" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 
8240 1726773081.38811: no more pending results, returning what we have 8240 1726773081.38815: results queue empty 8240 1726773081.38816: checking for any_errors_fatal 8240 1726773081.38818: done checking for any_errors_fatal 8240 1726773081.38818: checking for max_fail_percentage 8240 1726773081.38820: done checking for max_fail_percentage 8240 1726773081.38820: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.38821: done checking to see if all hosts have failed 8240 1726773081.38822: getting the remaining hosts for this loop 8240 1726773081.38823: done getting the remaining hosts for this loop 8240 1726773081.38827: getting the next task for host managed_node2 8240 1726773081.38833: done getting next task for host managed_node2 8240 1726773081.38836: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8240 1726773081.38839: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773081.38849: getting variables 8240 1726773081.38850: in VariableManager get_vars() 8240 1726773081.38887: Calling all_inventory to load vars for managed_node2 8240 1726773081.38890: Calling groups_inventory to load vars for managed_node2 8240 1726773081.38891: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.38901: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.38903: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.38905: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.39015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.39165: done with get_vars() 8240 1726773081.39174: done getting variables 8240 1726773081.39223: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.433) 0:01:00.036 **** 8240 1726773081.39248: entering _queue_task() for managed_node2/shell 8240 1726773081.39421: worker is 1 (out of 1 available) 8240 1726773081.39436: exiting _queue_task() for managed_node2/shell 8240 1726773081.39449: done queuing things up, now waiting for results queue to drain 8240 1726773081.39451: waiting for pending results... 
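The shell task queued here ("Get last verify results from log", verify_settings.yml:12) only runs when the preceding verification failed; the conditional it evaluates is logged just below. An illustrative sketch of that guard is given here; the actual shell command is not shown in this log, so the grep is purely an assumption.

    - name: Get last verify results from log
      shell: grep -i error /var/log/tuned/tuned.log || true      # assumed command, for illustration only
      register: __kernel_settings_register_verify_log            # hypothetical register name
      when: __kernel_settings_register_verify_values is failed   # conditional as evaluated below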
10715 1726773081.39581: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 10715 1726773081.39718: in run() - task 0affffe7-6841-885f-bbcf-000000000484 10715 1726773081.39734: variable 'ansible_search_path' from source: unknown 10715 1726773081.39738: variable 'ansible_search_path' from source: unknown 10715 1726773081.39766: calling self._execute() 10715 1726773081.39837: variable 'ansible_host' from source: host vars for 'managed_node2' 10715 1726773081.39846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10715 1726773081.39854: variable 'omit' from source: magic vars 10715 1726773081.40191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10715 1726773081.40374: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10715 1726773081.40411: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10715 1726773081.40439: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10715 1726773081.40465: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10715 1726773081.40548: variable '__kernel_settings_register_verify_values' from source: set_fact 10715 1726773081.40572: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 10715 1726773081.40578: when evaluation is False, skipping this task 10715 1726773081.40582: _execute() done 10715 1726773081.40588: dumping result to json 10715 1726773081.40592: done dumping result, returning 10715 1726773081.40600: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-885f-bbcf-000000000484] 10715 1726773081.40606: sending task result for task 0affffe7-6841-885f-bbcf-000000000484 10715 1726773081.40629: done sending task result for task 0affffe7-6841-885f-bbcf-000000000484 10715 1726773081.40632: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773081.40742: no more pending results, returning what we have 8240 1726773081.40746: results queue empty 8240 1726773081.40747: checking for any_errors_fatal 8240 1726773081.40755: done checking for any_errors_fatal 8240 1726773081.40756: checking for max_fail_percentage 8240 1726773081.40757: done checking for max_fail_percentage 8240 1726773081.40758: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.40759: done checking to see if all hosts have failed 8240 1726773081.40759: getting the remaining hosts for this loop 8240 1726773081.40760: done getting the remaining hosts for this loop 8240 1726773081.40765: getting the next task for host managed_node2 8240 1726773081.40771: done getting next task for host managed_node2 8240 1726773081.40774: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8240 1726773081.40778: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773081.40794: getting variables 8240 1726773081.40796: in VariableManager get_vars() 8240 1726773081.40830: Calling all_inventory to load vars for managed_node2 8240 1726773081.40832: Calling groups_inventory to load vars for managed_node2 8240 1726773081.40834: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.40843: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.40845: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.40847: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.40955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.41076: done with get_vars() 8240 1726773081.41084: done getting variables 8240 1726773081.41130: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.019) 0:01:00.055 **** 8240 1726773081.41154: entering _queue_task() for managed_node2/fail 8240 1726773081.41321: worker is 1 (out of 1 available) 8240 1726773081.41334: exiting _queue_task() for managed_node2/fail 8240 1726773081.41346: done queuing things up, now waiting for results queue to drain 8240 1726773081.41348: waiting for pending results... 
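The task queued here ("Report errors that are not bootloader errors", verify_settings.yml:23) uses the fail action behind the same guard, so it is likewise skipped when verification succeeded. A hedged sketch, with a failure message that is illustrative only and not the role's actual text:

    - name: Report errors that are not bootloader errors
      fail:
        msg: tuned-adm verify reported errors; see /var/log/tuned/tuned.log   # illustrative message only
      when: __kernel_settings_register_verify_values is failed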
10716 1726773081.41476: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 10716 1726773081.41608: in run() - task 0affffe7-6841-885f-bbcf-000000000485 10716 1726773081.41623: variable 'ansible_search_path' from source: unknown 10716 1726773081.41627: variable 'ansible_search_path' from source: unknown 10716 1726773081.41653: calling self._execute() 10716 1726773081.41722: variable 'ansible_host' from source: host vars for 'managed_node2' 10716 1726773081.41731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10716 1726773081.41740: variable 'omit' from source: magic vars 10716 1726773081.42066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10716 1726773081.42302: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10716 1726773081.42339: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10716 1726773081.42367: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10716 1726773081.42395: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10716 1726773081.42472: variable '__kernel_settings_register_verify_values' from source: set_fact 10716 1726773081.42498: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 10716 1726773081.42503: when evaluation is False, skipping this task 10716 1726773081.42507: _execute() done 10716 1726773081.42513: dumping result to json 10716 1726773081.42518: done dumping result, returning 10716 1726773081.42523: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-885f-bbcf-000000000485] 10716 1726773081.42529: sending task result for task 0affffe7-6841-885f-bbcf-000000000485 10716 1726773081.42553: done sending task result for task 0affffe7-6841-885f-bbcf-000000000485 10716 1726773081.42557: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773081.42696: no more pending results, returning what we have 8240 1726773081.42700: results queue empty 8240 1726773081.42700: checking for any_errors_fatal 8240 1726773081.42706: done checking for any_errors_fatal 8240 1726773081.42706: checking for max_fail_percentage 8240 1726773081.42708: done checking for max_fail_percentage 8240 1726773081.42708: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.42709: done checking to see if all hosts have failed 8240 1726773081.42710: getting the remaining hosts for this loop 8240 1726773081.42711: done getting the remaining hosts for this loop 8240 1726773081.42715: getting the next task for host managed_node2 8240 1726773081.42722: done getting next task for host managed_node2 8240 1726773081.42725: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8240 1726773081.42728: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773081.42743: getting variables 8240 1726773081.42744: in VariableManager get_vars() 8240 1726773081.42770: Calling all_inventory to load vars for managed_node2 8240 1726773081.42772: Calling groups_inventory to load vars for managed_node2 8240 1726773081.42773: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.42780: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.42782: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.42784: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.42892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.43050: done with get_vars() 8240 1726773081.43057: done getting variables 8240 1726773081.43103: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.019) 0:01:00.075 **** 8240 1726773081.43127: entering _queue_task() for managed_node2/set_fact 8240 1726773081.43296: worker is 1 (out of 1 available) 8240 1726773081.43310: exiting _queue_task() for managed_node2/set_fact 8240 1726773081.43323: done queuing things up, now waiting for results queue to drain 8240 1726773081.43324: waiting for pending results... 
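The set_fact task queued here ("Set the flag that reboot is needed to apply changes", main.yml:177) produces the kernel_settings_reboot_required fact that the later "Ensure kernel_settings_reboot_required is not set or is false" assertion checks. A minimal sketch consistent with the ok: result reported below; the real role may compute the value rather than hard-code it.

    - name: Set the flag that reboot is needed to apply changes
      set_fact:
        kernel_settings_reboot_required: false   # value as reported in the result below; literal used here for illustration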
10717 1726773081.43444: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 10717 1726773081.43562: in run() - task 0affffe7-6841-885f-bbcf-000000000315 10717 1726773081.43577: variable 'ansible_search_path' from source: unknown 10717 1726773081.43581: variable 'ansible_search_path' from source: unknown 10717 1726773081.43610: calling self._execute() 10717 1726773081.43677: variable 'ansible_host' from source: host vars for 'managed_node2' 10717 1726773081.43686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10717 1726773081.43695: variable 'omit' from source: magic vars 10717 1726773081.43768: variable 'omit' from source: magic vars 10717 1726773081.43803: variable 'omit' from source: magic vars 10717 1726773081.43826: variable 'omit' from source: magic vars 10717 1726773081.43859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10717 1726773081.43889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10717 1726773081.43908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10717 1726773081.43923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10717 1726773081.43933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10717 1726773081.43957: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10717 1726773081.43962: variable 'ansible_host' from source: host vars for 'managed_node2' 10717 1726773081.43967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10717 1726773081.44038: Set connection var ansible_pipelining to False 10717 1726773081.44046: Set connection var ansible_timeout to 10 10717 1726773081.44054: Set connection var ansible_module_compression to ZIP_DEFLATED 10717 1726773081.44057: Set connection var ansible_shell_type to sh 10717 1726773081.44062: Set connection var ansible_shell_executable to /bin/sh 10717 1726773081.44067: Set connection var ansible_connection to ssh 10717 1726773081.44081: variable 'ansible_shell_executable' from source: unknown 10717 1726773081.44087: variable 'ansible_connection' from source: unknown 10717 1726773081.44091: variable 'ansible_module_compression' from source: unknown 10717 1726773081.44094: variable 'ansible_shell_type' from source: unknown 10717 1726773081.44097: variable 'ansible_shell_executable' from source: unknown 10717 1726773081.44100: variable 'ansible_host' from source: host vars for 'managed_node2' 10717 1726773081.44104: variable 'ansible_pipelining' from source: unknown 10717 1726773081.44107: variable 'ansible_timeout' from source: unknown 10717 1726773081.44109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10717 1726773081.44203: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10717 1726773081.44217: variable 'omit' from source: magic vars 10717 1726773081.44223: starting attempt loop 10717 1726773081.44227: running the handler 10717 1726773081.44236: handler 
run complete 10717 1726773081.44246: attempt loop complete, returning result 10717 1726773081.44249: _execute() done 10717 1726773081.44252: dumping result to json 10717 1726773081.44255: done dumping result, returning 10717 1726773081.44262: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-000000000315] 10717 1726773081.44268: sending task result for task 0affffe7-6841-885f-bbcf-000000000315 10717 1726773081.44290: done sending task result for task 0affffe7-6841-885f-bbcf-000000000315 10717 1726773081.44293: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8240 1726773081.44418: no more pending results, returning what we have 8240 1726773081.44421: results queue empty 8240 1726773081.44422: checking for any_errors_fatal 8240 1726773081.44428: done checking for any_errors_fatal 8240 1726773081.44429: checking for max_fail_percentage 8240 1726773081.44430: done checking for max_fail_percentage 8240 1726773081.44431: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.44432: done checking to see if all hosts have failed 8240 1726773081.44432: getting the remaining hosts for this loop 8240 1726773081.44433: done getting the remaining hosts for this loop 8240 1726773081.44437: getting the next task for host managed_node2 8240 1726773081.44442: done getting next task for host managed_node2 8240 1726773081.44445: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8240 1726773081.44447: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773081.44456: getting variables 8240 1726773081.44457: in VariableManager get_vars() 8240 1726773081.44487: Calling all_inventory to load vars for managed_node2 8240 1726773081.44490: Calling groups_inventory to load vars for managed_node2 8240 1726773081.44492: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.44501: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.44503: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.44505: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.44610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.44727: done with get_vars() 8240 1726773081.44733: done getting variables 8240 1726773081.44774: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.016) 0:01:00.091 **** 8240 1726773081.44799: entering _queue_task() for managed_node2/set_fact 8240 1726773081.44954: worker is 1 (out of 1 available) 8240 1726773081.44968: exiting _queue_task() for managed_node2/set_fact 8240 1726773081.44982: done queuing things up, now waiting for results queue to drain 8240 1726773081.44983: waiting for pending results... 10718 1726773081.45103: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 10718 1726773081.45215: in run() - task 0affffe7-6841-885f-bbcf-000000000316 10718 1726773081.45230: variable 'ansible_search_path' from source: unknown 10718 1726773081.45234: variable 'ansible_search_path' from source: unknown 10718 1726773081.45259: calling self._execute() 10718 1726773081.45328: variable 'ansible_host' from source: host vars for 'managed_node2' 10718 1726773081.45337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10718 1726773081.45345: variable 'omit' from source: magic vars 10718 1726773081.45415: variable 'omit' from source: magic vars 10718 1726773081.45449: variable 'omit' from source: magic vars 10718 1726773081.45708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10718 1726773081.45941: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10718 1726773081.45977: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10718 1726773081.46005: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10718 1726773081.46031: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10718 1726773081.46130: variable '__kernel_settings_register_profile' from source: set_fact 10718 1726773081.46143: variable '__kernel_settings_register_mode' from source: set_fact 10718 1726773081.46152: variable '__kernel_settings_register_apply' from source: set_fact 10718 1726773081.46191: variable 'omit' from source: magic vars 10718 
1726773081.46211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10718 1726773081.46231: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10718 1726773081.46244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10718 1726773081.46254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10718 1726773081.46261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10718 1726773081.46282: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10718 1726773081.46297: variable 'ansible_host' from source: host vars for 'managed_node2' 10718 1726773081.46302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10718 1726773081.46367: Set connection var ansible_pipelining to False 10718 1726773081.46375: Set connection var ansible_timeout to 10 10718 1726773081.46382: Set connection var ansible_module_compression to ZIP_DEFLATED 10718 1726773081.46387: Set connection var ansible_shell_type to sh 10718 1726773081.46392: Set connection var ansible_shell_executable to /bin/sh 10718 1726773081.46398: Set connection var ansible_connection to ssh 10718 1726773081.46414: variable 'ansible_shell_executable' from source: unknown 10718 1726773081.46418: variable 'ansible_connection' from source: unknown 10718 1726773081.46422: variable 'ansible_module_compression' from source: unknown 10718 1726773081.46425: variable 'ansible_shell_type' from source: unknown 10718 1726773081.46428: variable 'ansible_shell_executable' from source: unknown 10718 1726773081.46432: variable 'ansible_host' from source: host vars for 'managed_node2' 10718 1726773081.46436: variable 'ansible_pipelining' from source: unknown 10718 1726773081.46439: variable 'ansible_timeout' from source: unknown 10718 1726773081.46444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10718 1726773081.46515: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10718 1726773081.46526: variable 'omit' from source: magic vars 10718 1726773081.46532: starting attempt loop 10718 1726773081.46535: running the handler 10718 1726773081.46545: handler run complete 10718 1726773081.46553: attempt loop complete, returning result 10718 1726773081.46556: _execute() done 10718 1726773081.46559: dumping result to json 10718 1726773081.46563: done dumping result, returning 10718 1726773081.46569: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-885f-bbcf-000000000316] 10718 1726773081.46575: sending task result for task 0affffe7-6841-885f-bbcf-000000000316 10718 1726773081.46595: done sending task result for task 0affffe7-6841-885f-bbcf-000000000316 10718 1726773081.46598: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8240 1726773081.46722: no more pending results, returning what we have 8240 1726773081.46725: results queue empty 8240 1726773081.46726: checking 
for any_errors_fatal 8240 1726773081.46731: done checking for any_errors_fatal 8240 1726773081.46732: checking for max_fail_percentage 8240 1726773081.46733: done checking for max_fail_percentage 8240 1726773081.46734: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.46734: done checking to see if all hosts have failed 8240 1726773081.46735: getting the remaining hosts for this loop 8240 1726773081.46736: done getting the remaining hosts for this loop 8240 1726773081.46739: getting the next task for host managed_node2 8240 1726773081.46748: done getting next task for host managed_node2 8240 1726773081.46749: ^ task is: TASK: meta (role_complete) 8240 1726773081.46752: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773081.46761: getting variables 8240 1726773081.46763: in VariableManager get_vars() 8240 1726773081.46801: Calling all_inventory to load vars for managed_node2 8240 1726773081.46804: Calling groups_inventory to load vars for managed_node2 8240 1726773081.46806: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.46815: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.46817: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.46819: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.46925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.47079: done with get_vars() 8240 1726773081.47088: done getting variables 8240 1726773081.47142: done queuing things up, now waiting for results queue to drain 8240 1726773081.47144: results queue empty 8240 1726773081.47144: checking for any_errors_fatal 8240 1726773081.47147: done checking for any_errors_fatal 8240 1726773081.47147: checking for max_fail_percentage 8240 1726773081.47148: done checking for max_fail_percentage 8240 1726773081.47152: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.47152: done checking to see if all hosts have failed 8240 1726773081.47153: getting the remaining hosts for this loop 8240 1726773081.47153: done getting the remaining hosts for this loop 8240 1726773081.47155: getting the next task for host managed_node2 8240 1726773081.47157: done getting next task for host managed_node2 8240 1726773081.47158: ^ task is: TASK: meta (flush_handlers) 8240 1726773081.47159: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773081.47161: getting variables 8240 1726773081.47162: in VariableManager get_vars() 8240 1726773081.47170: Calling all_inventory to load vars for managed_node2 8240 1726773081.47171: Calling groups_inventory to load vars for managed_node2 8240 1726773081.47173: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.47176: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.47177: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.47179: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.47257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.47362: done with get_vars() 8240 1726773081.47368: done getting variables TASK [Force handlers] ********************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:130 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.026) 0:01:00.117 **** 8240 1726773081.47415: in VariableManager get_vars() 8240 1726773081.47422: Calling all_inventory to load vars for managed_node2 8240 1726773081.47423: Calling groups_inventory to load vars for managed_node2 8240 1726773081.47424: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.47427: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.47428: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.47430: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.47509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.47612: done with get_vars() META: triggered running handlers for managed_node2 8240 1726773081.47622: done queuing things up, now waiting for results queue to drain 8240 1726773081.47623: results queue empty 8240 1726773081.47623: checking for any_errors_fatal 8240 1726773081.47625: done checking for any_errors_fatal 8240 1726773081.47625: checking for max_fail_percentage 8240 1726773081.47626: done checking for max_fail_percentage 8240 1726773081.47626: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.47626: done checking to see if all hosts have failed 8240 1726773081.47627: getting the remaining hosts for this loop 8240 1726773081.47627: done getting the remaining hosts for this loop 8240 1726773081.47628: getting the next task for host managed_node2 8240 1726773081.47631: done getting next task for host managed_node2 8240 1726773081.47632: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8240 1726773081.47633: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773081.47634: getting variables 8240 1726773081.47635: in VariableManager get_vars() 8240 1726773081.47641: Calling all_inventory to load vars for managed_node2 8240 1726773081.47642: Calling groups_inventory to load vars for managed_node2 8240 1726773081.47643: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.47646: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.47647: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.47648: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.47746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.47845: done with get_vars() 8240 1726773081.47851: done getting variables 8240 1726773081.47875: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings_reboot_required is not set or is false] *********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:133 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.004) 0:01:00.122 **** 8240 1726773081.47890: entering _queue_task() for managed_node2/assert 8240 1726773081.48058: worker is 1 (out of 1 available) 8240 1726773081.48072: exiting _queue_task() for managed_node2/assert 8240 1726773081.48088: done queuing things up, now waiting for results queue to drain 8240 1726773081.48091: waiting for pending results... 10719 1726773081.48214: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 10719 1726773081.48314: in run() - task 0affffe7-6841-885f-bbcf-00000000001c 10719 1726773081.48332: variable 'ansible_search_path' from source: unknown 10719 1726773081.48360: calling self._execute() 10719 1726773081.48429: variable 'ansible_host' from source: host vars for 'managed_node2' 10719 1726773081.48438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10719 1726773081.48446: variable 'omit' from source: magic vars 10719 1726773081.48520: variable 'omit' from source: magic vars 10719 1726773081.48543: variable 'omit' from source: magic vars 10719 1726773081.48565: variable 'omit' from source: magic vars 10719 1726773081.48597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10719 1726773081.48620: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10719 1726773081.48635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10719 1726773081.48648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10719 1726773081.48658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10719 1726773081.48679: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10719 1726773081.48683: variable 'ansible_host' from source: host vars for 'managed_node2' 10719 1726773081.48688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10719 1726773081.48753: Set connection var 
ansible_pipelining to False 10719 1726773081.48759: Set connection var ansible_timeout to 10 10719 1726773081.48764: Set connection var ansible_module_compression to ZIP_DEFLATED 10719 1726773081.48767: Set connection var ansible_shell_type to sh 10719 1726773081.48771: Set connection var ansible_shell_executable to /bin/sh 10719 1726773081.48774: Set connection var ansible_connection to ssh 10719 1726773081.48789: variable 'ansible_shell_executable' from source: unknown 10719 1726773081.48794: variable 'ansible_connection' from source: unknown 10719 1726773081.48797: variable 'ansible_module_compression' from source: unknown 10719 1726773081.48801: variable 'ansible_shell_type' from source: unknown 10719 1726773081.48804: variable 'ansible_shell_executable' from source: unknown 10719 1726773081.48807: variable 'ansible_host' from source: host vars for 'managed_node2' 10719 1726773081.48811: variable 'ansible_pipelining' from source: unknown 10719 1726773081.48814: variable 'ansible_timeout' from source: unknown 10719 1726773081.48819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10719 1726773081.48916: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10719 1726773081.48928: variable 'omit' from source: magic vars 10719 1726773081.48934: starting attempt loop 10719 1726773081.48937: running the handler 10719 1726773081.49181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10719 1726773081.50882: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10719 1726773081.50931: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10719 1726773081.50960: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10719 1726773081.50989: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10719 1726773081.51010: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10719 1726773081.51060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10719 1726773081.51080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10719 1726773081.51101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10719 1726773081.51128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10719 1726773081.51138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10719 
1726773081.51224: variable 'kernel_settings_reboot_required' from source: set_fact 10719 1726773081.51241: Evaluated conditional (not kernel_settings_reboot_required | d(false)): True 10719 1726773081.51249: handler run complete 10719 1726773081.51266: attempt loop complete, returning result 10719 1726773081.51270: _execute() done 10719 1726773081.51273: dumping result to json 10719 1726773081.51277: done dumping result, returning 10719 1726773081.51284: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [0affffe7-6841-885f-bbcf-00000000001c] 10719 1726773081.51291: sending task result for task 0affffe7-6841-885f-bbcf-00000000001c 10719 1726773081.51315: done sending task result for task 0affffe7-6841-885f-bbcf-00000000001c 10719 1726773081.51318: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8240 1726773081.51434: no more pending results, returning what we have 8240 1726773081.51437: results queue empty 8240 1726773081.51438: checking for any_errors_fatal 8240 1726773081.51440: done checking for any_errors_fatal 8240 1726773081.51441: checking for max_fail_percentage 8240 1726773081.51442: done checking for max_fail_percentage 8240 1726773081.51443: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.51444: done checking to see if all hosts have failed 8240 1726773081.51444: getting the remaining hosts for this loop 8240 1726773081.51445: done getting the remaining hosts for this loop 8240 1726773081.51449: getting the next task for host managed_node2 8240 1726773081.51454: done getting next task for host managed_node2 8240 1726773081.51456: ^ task is: TASK: Ensure role reported changed 8240 1726773081.51458: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773081.51461: getting variables 8240 1726773081.51462: in VariableManager get_vars() 8240 1726773081.51499: Calling all_inventory to load vars for managed_node2 8240 1726773081.51503: Calling groups_inventory to load vars for managed_node2 8240 1726773081.51505: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.51515: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.51523: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.51526: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.51645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.51755: done with get_vars() 8240 1726773081.51762: done getting variables 8240 1726773081.51809: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:137 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.039) 0:01:00.162 **** 8240 1726773081.51831: entering _queue_task() for managed_node2/assert 8240 1726773081.51990: worker is 1 (out of 1 available) 8240 1726773081.52008: exiting _queue_task() for managed_node2/assert 8240 1726773081.52021: done queuing things up, now waiting for results queue to drain 8240 1726773081.52022: waiting for pending results... 10720 1726773081.52142: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 10720 1726773081.52242: in run() - task 0affffe7-6841-885f-bbcf-00000000001d 10720 1726773081.52258: variable 'ansible_search_path' from source: unknown 10720 1726773081.52288: calling self._execute() 10720 1726773081.52357: variable 'ansible_host' from source: host vars for 'managed_node2' 10720 1726773081.52627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10720 1726773081.52637: variable 'omit' from source: magic vars 10720 1726773081.52709: variable 'omit' from source: magic vars 10720 1726773081.52733: variable 'omit' from source: magic vars 10720 1726773081.52755: variable 'omit' from source: magic vars 10720 1726773081.52787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10720 1726773081.52813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10720 1726773081.52831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10720 1726773081.52846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10720 1726773081.52857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10720 1726773081.52880: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10720 1726773081.52886: variable 'ansible_host' from source: host vars for 'managed_node2' 10720 1726773081.52891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10720 1726773081.52959: Set connection var ansible_pipelining to False 10720 
1726773081.52966: Set connection var ansible_timeout to 10 10720 1726773081.52974: Set connection var ansible_module_compression to ZIP_DEFLATED 10720 1726773081.52977: Set connection var ansible_shell_type to sh 10720 1726773081.52982: Set connection var ansible_shell_executable to /bin/sh 10720 1726773081.52989: Set connection var ansible_connection to ssh 10720 1726773081.53005: variable 'ansible_shell_executable' from source: unknown 10720 1726773081.53010: variable 'ansible_connection' from source: unknown 10720 1726773081.53012: variable 'ansible_module_compression' from source: unknown 10720 1726773081.53014: variable 'ansible_shell_type' from source: unknown 10720 1726773081.53016: variable 'ansible_shell_executable' from source: unknown 10720 1726773081.53017: variable 'ansible_host' from source: host vars for 'managed_node2' 10720 1726773081.53019: variable 'ansible_pipelining' from source: unknown 10720 1726773081.53021: variable 'ansible_timeout' from source: unknown 10720 1726773081.53023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10720 1726773081.53119: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10720 1726773081.53131: variable 'omit' from source: magic vars 10720 1726773081.53136: starting attempt loop 10720 1726773081.53139: running the handler 10720 1726773081.53371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10720 1726773081.54863: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10720 1726773081.54917: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10720 1726773081.54945: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10720 1726773081.54972: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10720 1726773081.54994: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10720 1726773081.55041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10720 1726773081.55058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10720 1726773081.55074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10720 1726773081.55112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10720 1726773081.55125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10720 1726773081.55203: variable 
'__kernel_settings_changed' from source: set_fact 10720 1726773081.55219: Evaluated conditional (__kernel_settings_changed | d(false)): True 10720 1726773081.55227: handler run complete 10720 1726773081.55243: attempt loop complete, returning result 10720 1726773081.55247: _execute() done 10720 1726773081.55250: dumping result to json 10720 1726773081.55254: done dumping result, returning 10720 1726773081.55261: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [0affffe7-6841-885f-bbcf-00000000001d] 10720 1726773081.55266: sending task result for task 0affffe7-6841-885f-bbcf-00000000001d 10720 1726773081.55290: done sending task result for task 0affffe7-6841-885f-bbcf-00000000001d 10720 1726773081.55294: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8240 1726773081.55429: no more pending results, returning what we have 8240 1726773081.55432: results queue empty 8240 1726773081.55433: checking for any_errors_fatal 8240 1726773081.55440: done checking for any_errors_fatal 8240 1726773081.55440: checking for max_fail_percentage 8240 1726773081.55442: done checking for max_fail_percentage 8240 1726773081.55442: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.55443: done checking to see if all hosts have failed 8240 1726773081.55444: getting the remaining hosts for this loop 8240 1726773081.55445: done getting the remaining hosts for this loop 8240 1726773081.55448: getting the next task for host managed_node2 8240 1726773081.55454: done getting next task for host managed_node2 8240 1726773081.55456: ^ task is: TASK: Check sysctl after reboot 8240 1726773081.55457: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773081.55460: getting variables 8240 1726773081.55462: in VariableManager get_vars() 8240 1726773081.55720: Calling all_inventory to load vars for managed_node2 8240 1726773081.55722: Calling groups_inventory to load vars for managed_node2 8240 1726773081.55723: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.55731: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.55733: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.55741: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.55835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.55942: done with get_vars() 8240 1726773081.55949: done getting variables 8240 1726773081.55993: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysctl after reboot] *********************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:141 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.041) 0:01:00.204 **** 8240 1726773081.56015: entering _queue_task() for managed_node2/shell 8240 1726773081.56178: worker is 1 (out of 1 available) 8240 1726773081.56193: exiting _queue_task() for managed_node2/shell 8240 1726773081.56208: done queuing things up, now waiting for results queue to drain 8240 1726773081.56210: waiting for pending results... 10721 1726773081.56330: running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot 10721 1726773081.56435: in run() - task 0affffe7-6841-885f-bbcf-00000000001e 10721 1726773081.56451: variable 'ansible_search_path' from source: unknown 10721 1726773081.56479: calling self._execute() 10721 1726773081.56548: variable 'ansible_host' from source: host vars for 'managed_node2' 10721 1726773081.56557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10721 1726773081.56566: variable 'omit' from source: magic vars 10721 1726773081.56646: variable 'omit' from source: magic vars 10721 1726773081.56672: variable 'omit' from source: magic vars 10721 1726773081.56698: variable 'omit' from source: magic vars 10721 1726773081.56731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10721 1726773081.56759: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10721 1726773081.56779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10721 1726773081.56795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10721 1726773081.56807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10721 1726773081.56831: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10721 1726773081.56837: variable 'ansible_host' from source: host vars for 'managed_node2' 10721 1726773081.56841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10721 1726773081.56914: Set connection var ansible_pipelining to False 10721 
1726773081.56921: Set connection var ansible_timeout to 10 10721 1726773081.56929: Set connection var ansible_module_compression to ZIP_DEFLATED 10721 1726773081.56932: Set connection var ansible_shell_type to sh 10721 1726773081.56938: Set connection var ansible_shell_executable to /bin/sh 10721 1726773081.56942: Set connection var ansible_connection to ssh 10721 1726773081.56956: variable 'ansible_shell_executable' from source: unknown 10721 1726773081.56959: variable 'ansible_connection' from source: unknown 10721 1726773081.56961: variable 'ansible_module_compression' from source: unknown 10721 1726773081.56962: variable 'ansible_shell_type' from source: unknown 10721 1726773081.56964: variable 'ansible_shell_executable' from source: unknown 10721 1726773081.56966: variable 'ansible_host' from source: host vars for 'managed_node2' 10721 1726773081.56968: variable 'ansible_pipelining' from source: unknown 10721 1726773081.56969: variable 'ansible_timeout' from source: unknown 10721 1726773081.56972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10721 1726773081.57068: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10721 1726773081.57081: variable 'omit' from source: magic vars 10721 1726773081.57088: starting attempt loop 10721 1726773081.57092: running the handler 10721 1726773081.57101: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10721 1726773081.57118: _low_level_execute_command(): starting 10721 1726773081.57126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10721 1726773081.59526: stdout chunk (state=2): >>>/root <<< 10721 1726773081.59717: stderr chunk (state=3): >>><<< 10721 1726773081.59725: stdout chunk (state=3): >>><<< 10721 1726773081.59747: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10721 1726773081.59762: _low_level_execute_command(): starting 10721 1726773081.59769: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526 `" && echo ansible-tmp-1726773081.597559-10721-201610758472526="` echo /root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526 `" ) && sleep 0' 10721 1726773081.62370: stdout chunk (state=2): >>>ansible-tmp-1726773081.597559-10721-201610758472526=/root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526 <<< 10721 1726773081.62501: stderr chunk (state=3): >>><<< 10721 1726773081.62509: stdout chunk (state=3): >>><<< 10721 1726773081.62527: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773081.597559-10721-201610758472526=/root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526 , stderr= 10721 1726773081.62555: variable 'ansible_module_compression' from source: unknown 10721 1726773081.62607: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10721 1726773081.62639: variable 'ansible_facts' from source: 
unknown 10721 1726773081.62715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526/AnsiballZ_command.py 10721 1726773081.62825: Sending initial data 10721 1726773081.62832: Sent initial data (154 bytes) 10721 1726773081.65373: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpxg93ofxw /root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526/AnsiballZ_command.py <<< 10721 1726773081.66479: stderr chunk (state=3): >>><<< 10721 1726773081.66490: stdout chunk (state=3): >>><<< 10721 1726773081.66513: done transferring module to remote 10721 1726773081.66523: _low_level_execute_command(): starting 10721 1726773081.66528: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526/ /root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526/AnsiballZ_command.py && sleep 0' 10721 1726773081.68941: stderr chunk (state=2): >>><<< 10721 1726773081.68952: stdout chunk (state=2): >>><<< 10721 1726773081.68968: _low_level_execute_command() done: rc=0, stdout=, stderr= 10721 1726773081.68972: _low_level_execute_command(): starting 10721 1726773081.68978: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526/AnsiballZ_command.py && sleep 0' 10721 1726773081.84566: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "start": "2024-09-19 15:11:21.837832", "end": "2024-09-19 15:11:21.843732", "delta": "0:00:00.005900", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10721 1726773081.85710: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10721 1726773081.85759: stderr chunk (state=3): >>><<< 10721 1726773081.85766: stdout chunk (state=3): >>><<< 10721 1726773081.85783: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "start": "2024-09-19 15:11:21.837832", "end": "2024-09-19 15:11:21.843732", "delta": "0:00:00.005900", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
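
The records above capture the verification idiom the test uses after reboot: a shell pipeline of the form "set -euo pipefail; sysctl -n KEY | grep -Lxvq EXPECTED", which is intended to exit 0 only when the live sysctl value printed by sysctl is exactly the expected string (here fs.file-max and 400000, per the command shown in the module result). The block below is a minimal, self-contained sketch of such a check, not a reproduction of tests_change_settings.yml; the host pattern and the changed_when: false reporting are assumptions inferred from the "ok: ... changed: false" result recorded for this task.

# Hedged reconstruction of the post-reboot sysctl check seen in the log above.
- hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Check sysctl after reboot
      shell: |-
        set -euo pipefail
        sysctl -n fs.file-max | grep -Lxvq 400000
      changed_when: false   # assumption: matches the "changed": false reported above
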
10721 1726773081.85827: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10721 1726773081.85838: _low_level_execute_command(): starting 10721 1726773081.85845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773081.597559-10721-201610758472526/ > /dev/null 2>&1 && sleep 0' 10721 1726773081.88262: stderr chunk (state=2): >>><<< 10721 1726773081.88274: stdout chunk (state=2): >>><<< 10721 1726773081.88290: _low_level_execute_command() done: rc=0, stdout=, stderr= 10721 1726773081.88299: handler run complete 10721 1726773081.88317: Evaluated conditional (False): False 10721 1726773081.88326: attempt loop complete, returning result 10721 1726773081.88329: _execute() done 10721 1726773081.88332: dumping result to json 10721 1726773081.88338: done dumping result, returning 10721 1726773081.88344: done running TaskExecutor() for managed_node2/TASK: Check sysctl after reboot [0affffe7-6841-885f-bbcf-00000000001e] 10721 1726773081.88350: sending task result for task 0affffe7-6841-885f-bbcf-00000000001e 10721 1726773081.88383: done sending task result for task 0affffe7-6841-885f-bbcf-00000000001e 10721 1726773081.88388: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lxvq 400000", "delta": "0:00:00.005900", "end": "2024-09-19 15:11:21.843732", "rc": 0, "start": "2024-09-19 15:11:21.837832" } 8240 1726773081.88529: no more pending results, returning what we have 8240 1726773081.88532: results queue empty 8240 1726773081.88533: checking for any_errors_fatal 8240 1726773081.88541: done checking for any_errors_fatal 8240 1726773081.88542: checking for max_fail_percentage 8240 1726773081.88543: done checking for max_fail_percentage 8240 1726773081.88544: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.88545: done checking to see if all hosts have failed 8240 1726773081.88545: getting the remaining hosts for this loop 8240 1726773081.88546: done getting the remaining hosts for this loop 8240 1726773081.88550: getting the next task for host managed_node2 8240 1726773081.88556: done getting next task for host managed_node2 8240 1726773081.88558: ^ task is: TASK: Apply kernel_settings for removing 8240 1726773081.88560: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773081.88563: getting variables 8240 1726773081.88565: in VariableManager get_vars() 8240 1726773081.88602: Calling all_inventory to load vars for managed_node2 8240 1726773081.88605: Calling groups_inventory to load vars for managed_node2 8240 1726773081.88607: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.88617: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.88620: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.88623: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.88742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.88887: done with get_vars() 8240 1726773081.88895: done getting variables TASK [Apply kernel_settings for removing] ************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:147 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.329) 0:01:00.533 **** 8240 1726773081.88964: entering _queue_task() for managed_node2/include_role 8240 1726773081.89134: worker is 1 (out of 1 available) 8240 1726773081.89150: exiting _queue_task() for managed_node2/include_role 8240 1726773081.89163: done queuing things up, now waiting for results queue to drain 8240 1726773081.89165: waiting for pending results... 10729 1726773081.89288: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing 10729 1726773081.89397: in run() - task 0affffe7-6841-885f-bbcf-00000000001f 10729 1726773081.89415: variable 'ansible_search_path' from source: unknown 10729 1726773081.89444: calling self._execute() 10729 1726773081.89517: variable 'ansible_host' from source: host vars for 'managed_node2' 10729 1726773081.89527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10729 1726773081.89536: variable 'omit' from source: magic vars 10729 1726773081.89610: _execute() done 10729 1726773081.89616: dumping result to json 10729 1726773081.89621: done dumping result, returning 10729 1726773081.89626: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing [0affffe7-6841-885f-bbcf-00000000001f] 10729 1726773081.89634: sending task result for task 0affffe7-6841-885f-bbcf-00000000001f 10729 1726773081.89666: done sending task result for task 0affffe7-6841-885f-bbcf-00000000001f 10729 1726773081.89669: WORKER PROCESS EXITING 8240 1726773081.89780: no more pending results, returning what we have 8240 1726773081.89784: in VariableManager get_vars() 8240 1726773081.89822: Calling all_inventory to load vars for managed_node2 8240 1726773081.89825: Calling groups_inventory to load vars for managed_node2 8240 1726773081.89827: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.89838: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.89840: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.89842: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.89954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.90072: done with get_vars() 8240 1726773081.90077: variable 'ansible_search_path' from source: unknown 8240 1726773081.91656: variable 'omit' from source: magic vars 8240 1726773081.91672: variable 'omit' from source: magic vars 8240 1726773081.91681: variable 'omit' 
from source: magic vars 8240 1726773081.91684: we have included files to process 8240 1726773081.91686: generating all_blocks data 8240 1726773081.91688: done generating all_blocks data 8240 1726773081.91691: processing included file: fedora.linux_system_roles.kernel_settings 8240 1726773081.91709: in VariableManager get_vars() 8240 1726773081.91722: done with get_vars() 8240 1726773081.91741: in VariableManager get_vars() 8240 1726773081.91751: done with get_vars() 8240 1726773081.91778: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8240 1726773081.91824: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8240 1726773081.91839: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8240 1726773081.91883: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8240 1726773081.92211: in VariableManager get_vars() 8240 1726773081.92225: done with get_vars() 8240 1726773081.93057: in VariableManager get_vars() 8240 1726773081.93071: done with get_vars() 8240 1726773081.93174: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8240 1726773081.93561: iterating over new_blocks loaded from include file 8240 1726773081.93575: in VariableManager get_vars() 8240 1726773081.93587: done with get_vars() 8240 1726773081.93589: filtering new block on tags 8240 1726773081.93613: done filtering new block on tags 8240 1726773081.93615: in VariableManager get_vars() 8240 1726773081.93624: done with get_vars() 8240 1726773081.93625: filtering new block on tags 8240 1726773081.93649: done filtering new block on tags 8240 1726773081.93651: in VariableManager get_vars() 8240 1726773081.93660: done with get_vars() 8240 1726773081.93661: filtering new block on tags 8240 1726773081.93738: done filtering new block on tags 8240 1726773081.93740: in VariableManager get_vars() 8240 1726773081.93752: done with get_vars() 8240 1726773081.93753: filtering new block on tags 8240 1726773081.93780: done filtering new block on tags 8240 1726773081.93781: done iterating over new_blocks loaded from include file 8240 1726773081.93781: extending task lists for all hosts with included blocks 8240 1726773081.95255: done extending task lists 8240 1726773081.95256: done processing included files 8240 1726773081.95257: results queue empty 8240 1726773081.95257: checking for any_errors_fatal 8240 1726773081.95262: done checking for any_errors_fatal 8240 1726773081.95262: checking for max_fail_percentage 8240 1726773081.95263: done checking for max_fail_percentage 8240 1726773081.95263: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.95264: done checking to see if all hosts have failed 8240 1726773081.95264: getting the remaining hosts for this loop 8240 1726773081.95265: done getting the remaining hosts for this loop 8240 1726773081.95267: getting the next task for host managed_node2 8240 1726773081.95270: done getting next task for host managed_node2 8240 1726773081.95271: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8240 1726773081.95273: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773081.95280: getting variables 8240 1726773081.95281: in VariableManager get_vars() 8240 1726773081.95291: Calling all_inventory to load vars for managed_node2 8240 1726773081.95293: Calling groups_inventory to load vars for managed_node2 8240 1726773081.95294: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.95298: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.95300: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.95301: Calling groups_plugins_play to load vars for managed_node2 8240 1726773081.95398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773081.95509: done with get_vars() 8240 1726773081.95516: done getting variables 8240 1726773081.95542: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:21 -0400 (0:00:00.065) 0:01:00.599 **** 8240 1726773081.95564: entering _queue_task() for managed_node2/fail 8240 1726773081.95761: worker is 1 (out of 1 available) 8240 1726773081.95775: exiting _queue_task() for managed_node2/fail 8240 1726773081.95790: done queuing things up, now waiting for results queue to drain 8240 1726773081.95791: waiting for pending results... 
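
The "Apply kernel_settings for removing" include task processed in the records above is what pulled in the role's vars/main.yml, defaults/main.yml, meta/main.yml, tasks/main.yml and handlers/main.yml from the collection path shown. A minimal sketch of such an invocation follows; the host pattern is an assumption, and the role variables the test actually passes (the settings being removed) are not visible in this log, so none are shown.

# Illustrative only: invoking the role the way the test play does.
- hosts: managed_node2
  tasks:
    - name: Apply kernel_settings for removing
      include_role:
        name: fedora.linux_system_roles.kernel_settings
      # The test supplies role variables here (e.g. kernel_settings_sysctl
      # entries to remove); their values do not appear in this log.
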
10730 1726773081.95918: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 10730 1726773081.96043: in run() - task 0affffe7-6841-885f-bbcf-00000000060d 10730 1726773081.96059: variable 'ansible_search_path' from source: unknown 10730 1726773081.96063: variable 'ansible_search_path' from source: unknown 10730 1726773081.96092: calling self._execute() 10730 1726773081.96166: variable 'ansible_host' from source: host vars for 'managed_node2' 10730 1726773081.96174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10730 1726773081.96182: variable 'omit' from source: magic vars 10730 1726773081.96540: variable 'kernel_settings_sysctl' from source: include params 10730 1726773081.96557: variable '__kernel_settings_state_empty' from source: role '' all vars 10730 1726773081.96567: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 10730 1726773081.96770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10730 1726773081.98517: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10730 1726773081.98563: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10730 1726773081.98594: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10730 1726773081.98625: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10730 1726773081.98646: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10730 1726773081.98704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10730 1726773081.98729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10730 1726773081.98749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10730 1726773081.98777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10730 1726773081.98790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10730 1726773081.98832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10730 1726773081.98850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10730 1726773081.98867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10730 1726773081.98895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10730 1726773081.98908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10730 1726773081.98939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10730 1726773081.98957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10730 1726773081.98974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10730 1726773081.99002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10730 1726773081.99014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10730 1726773081.99195: variable 'kernel_settings_sysctl' from source: include params 10730 1726773081.99246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10730 1726773081.99354: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10730 1726773081.99383: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10730 1726773081.99410: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10730 1726773081.99446: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10730 1726773081.99477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10730 1726773081.99497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10730 1726773081.99517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10730 1726773081.99535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10730 1726773081.99561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10730 1726773081.99578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10730 1726773081.99598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10730 1726773081.99618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10730 1726773081.99637: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 10730 1726773081.99642: when evaluation is False, skipping this task 10730 1726773081.99646: _execute() done 10730 1726773081.99649: dumping result to json 10730 1726773081.99652: done dumping result, returning 10730 1726773081.99659: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-885f-bbcf-00000000060d] 10730 1726773081.99665: sending task result for task 0affffe7-6841-885f-bbcf-00000000060d 10730 1726773081.99690: done sending task result for task 0affffe7-6841-885f-bbcf-00000000060d 10730 1726773081.99694: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8240 1726773081.99811: no more pending results, returning what we have 8240 1726773081.99815: results queue empty 8240 1726773081.99816: checking for any_errors_fatal 8240 1726773081.99818: done checking for any_errors_fatal 8240 1726773081.99818: checking for max_fail_percentage 8240 1726773081.99820: done checking for max_fail_percentage 8240 1726773081.99820: checking to see if all hosts have failed and the running result is not ok 8240 1726773081.99821: done checking to see if all hosts have failed 8240 1726773081.99822: getting the remaining hosts for this loop 8240 1726773081.99823: done getting the remaining hosts for this loop 8240 1726773081.99827: getting the next task for host managed_node2 8240 1726773081.99833: done getting next task for host managed_node2 8240 1726773081.99836: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8240 1726773081.99839: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773081.99854: getting variables 8240 1726773081.99856: in VariableManager get_vars() 8240 1726773081.99893: Calling all_inventory to load vars for managed_node2 8240 1726773081.99896: Calling groups_inventory to load vars for managed_node2 8240 1726773081.99898: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773081.99908: Calling all_plugins_play to load vars for managed_node2 8240 1726773081.99911: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773081.99913: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.00038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.00157: done with get_vars() 8240 1726773082.00166: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.046) 0:01:00.646 **** 8240 1726773082.00236: entering _queue_task() for managed_node2/include_tasks 8240 1726773082.00401: worker is 1 (out of 1 available) 8240 1726773082.00417: exiting _queue_task() for managed_node2/include_tasks 8240 1726773082.00430: done queuing things up, now waiting for results queue to drain 8240 1726773082.00431: waiting for pending results... 10731 1726773082.00557: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 10731 1726773082.00675: in run() - task 0affffe7-6841-885f-bbcf-00000000060e 10731 1726773082.00692: variable 'ansible_search_path' from source: unknown 10731 1726773082.00697: variable 'ansible_search_path' from source: unknown 10731 1726773082.00727: calling self._execute() 10731 1726773082.00797: variable 'ansible_host' from source: host vars for 'managed_node2' 10731 1726773082.00807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10731 1726773082.00817: variable 'omit' from source: magic vars 10731 1726773082.00889: _execute() done 10731 1726773082.00896: dumping result to json 10731 1726773082.00902: done dumping result, returning 10731 1726773082.00908: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-885f-bbcf-00000000060e] 10731 1726773082.00915: sending task result for task 0affffe7-6841-885f-bbcf-00000000060e 10731 1726773082.00940: done sending task result for task 0affffe7-6841-885f-bbcf-00000000060e 10731 1726773082.00942: WORKER PROCESS EXITING 8240 1726773082.01129: no more pending results, returning what we have 8240 1726773082.01132: in VariableManager get_vars() 8240 1726773082.01164: Calling all_inventory to load vars for managed_node2 8240 1726773082.01166: Calling groups_inventory to load vars for managed_node2 8240 1726773082.01167: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773082.01175: Calling all_plugins_play to load vars for managed_node2 8240 1726773082.01177: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773082.01179: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.01317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.01429: done with get_vars() 8240 1726773082.01434: variable 'ansible_search_path' from source: unknown 8240 
1726773082.01434: variable 'ansible_search_path' from source: unknown 8240 1726773082.01456: we have included files to process 8240 1726773082.01457: generating all_blocks data 8240 1726773082.01458: done generating all_blocks data 8240 1726773082.01463: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773082.01464: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773082.01465: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8240 1726773082.01912: done processing included file 8240 1726773082.01914: iterating over new_blocks loaded from include file 8240 1726773082.01915: in VariableManager get_vars() 8240 1726773082.01931: done with get_vars() 8240 1726773082.01932: filtering new block on tags 8240 1726773082.01949: done filtering new block on tags 8240 1726773082.01951: in VariableManager get_vars() 8240 1726773082.01963: done with get_vars() 8240 1726773082.01964: filtering new block on tags 8240 1726773082.01989: done filtering new block on tags 8240 1726773082.01991: in VariableManager get_vars() 8240 1726773082.02005: done with get_vars() 8240 1726773082.02006: filtering new block on tags 8240 1726773082.02030: done filtering new block on tags 8240 1726773082.02032: in VariableManager get_vars() 8240 1726773082.02046: done with get_vars() 8240 1726773082.02047: filtering new block on tags 8240 1726773082.02062: done filtering new block on tags 8240 1726773082.02063: done iterating over new_blocks loaded from include file 8240 1726773082.02064: extending task lists for all hosts with included blocks 8240 1726773082.02181: done extending task lists 8240 1726773082.02182: done processing included files 8240 1726773082.02183: results queue empty 8240 1726773082.02183: checking for any_errors_fatal 8240 1726773082.02187: done checking for any_errors_fatal 8240 1726773082.02188: checking for max_fail_percentage 8240 1726773082.02188: done checking for max_fail_percentage 8240 1726773082.02189: checking to see if all hosts have failed and the running result is not ok 8240 1726773082.02189: done checking to see if all hosts have failed 8240 1726773082.02190: getting the remaining hosts for this loop 8240 1726773082.02190: done getting the remaining hosts for this loop 8240 1726773082.02192: getting the next task for host managed_node2 8240 1726773082.02195: done getting next task for host managed_node2 8240 1726773082.02196: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8240 1726773082.02199: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773082.02205: getting variables 8240 1726773082.02206: in VariableManager get_vars() 8240 1726773082.02214: Calling all_inventory to load vars for managed_node2 8240 1726773082.02215: Calling groups_inventory to load vars for managed_node2 8240 1726773082.02216: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773082.02219: Calling all_plugins_play to load vars for managed_node2 8240 1726773082.02220: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773082.02222: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.02298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.02405: done with get_vars() 8240 1726773082.02411: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.022) 0:01:00.668 **** 8240 1726773082.02458: entering _queue_task() for managed_node2/setup 8240 1726773082.02622: worker is 1 (out of 1 available) 8240 1726773082.02634: exiting _queue_task() for managed_node2/setup 8240 1726773082.02647: done queuing things up, now waiting for results queue to drain 8240 1726773082.02649: waiting for pending results... 10732 1726773082.02776: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 10732 1726773082.02928: in run() - task 0affffe7-6841-885f-bbcf-000000000789 10732 1726773082.02944: variable 'ansible_search_path' from source: unknown 10732 1726773082.02948: variable 'ansible_search_path' from source: unknown 10732 1726773082.02974: calling self._execute() 10732 1726773082.03042: variable 'ansible_host' from source: host vars for 'managed_node2' 10732 1726773082.03051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10732 1726773082.03060: variable 'omit' from source: magic vars 10732 1726773082.03437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10732 1726773082.05504: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10732 1726773082.05551: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10732 1726773082.05581: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10732 1726773082.05612: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10732 1726773082.05634: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10732 1726773082.05690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10732 1726773082.05717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 10732 1726773082.05737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10732 1726773082.05763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10732 1726773082.05774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10732 1726773082.05818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10732 1726773082.05836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10732 1726773082.05853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10732 1726773082.05878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10732 1726773082.05891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10732 1726773082.06008: variable '__kernel_settings_required_facts' from source: role '' all vars 10732 1726773082.06018: variable 'ansible_facts' from source: unknown 10732 1726773082.06074: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10732 1726773082.06079: when evaluation is False, skipping this task 10732 1726773082.06086: _execute() done 10732 1726773082.06091: dumping result to json 10732 1726773082.06095: done dumping result, returning 10732 1726773082.06103: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-885f-bbcf-000000000789] 10732 1726773082.06109: sending task result for task 0affffe7-6841-885f-bbcf-000000000789 10732 1726773082.06132: done sending task result for task 0affffe7-6841-885f-bbcf-000000000789 10732 1726773082.06135: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8240 1726773082.06242: no more pending results, returning what we have 8240 1726773082.06246: results queue empty 8240 1726773082.06247: checking for any_errors_fatal 8240 1726773082.06249: done checking for any_errors_fatal 8240 1726773082.06249: checking for max_fail_percentage 8240 1726773082.06250: done checking for max_fail_percentage 8240 1726773082.06251: checking to see if all hosts have failed and the running result is not ok 8240 1726773082.06252: done checking to see if all hosts have failed 8240 1726773082.06253: getting the remaining hosts for this loop 8240 
1726773082.06254: done getting the remaining hosts for this loop 8240 1726773082.06257: getting the next task for host managed_node2 8240 1726773082.06266: done getting next task for host managed_node2 8240 1726773082.06270: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8240 1726773082.06274: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773082.06292: getting variables 8240 1726773082.06293: in VariableManager get_vars() 8240 1726773082.06328: Calling all_inventory to load vars for managed_node2 8240 1726773082.06330: Calling groups_inventory to load vars for managed_node2 8240 1726773082.06332: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773082.06342: Calling all_plugins_play to load vars for managed_node2 8240 1726773082.06345: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773082.06347: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.06466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.06771: done with get_vars() 8240 1726773082.06779: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.043) 0:01:00.712 **** 8240 1726773082.06848: entering _queue_task() for managed_node2/stat 8240 1726773082.07011: worker is 1 (out of 1 available) 8240 1726773082.07025: exiting _queue_task() for managed_node2/stat 8240 1726773082.07037: done queuing things up, now waiting for results queue to drain 8240 1726773082.07039: waiting for pending results... 
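
The skipped "Ensure ansible_facts used by role" task above is a guarded fact-gathering step: it re-runs setup only when some fact named in __kernel_settings_required_facts is missing from ansible_facts, which is why the conditional evaluated False here. The sketch below shows that gating pattern under stated assumptions; the fact names and the gather_subset choice are placeholders, since the role's actual lists are not printed in this log. Only the when expression is taken verbatim from the record above.

# Sketch of the "gather facts only if something is missing" pattern.
- hosts: managed_node2
  gather_facts: false
  vars:
    # Assumed fact names; the real list lives in the role's vars files.
    __kernel_settings_required_facts:
      - distribution
      - distribution_major_version
  tasks:
    - name: Ensure ansible_facts used by role
      setup:
        gather_subset: min   # subset choice is an assumption
      when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0
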
10736 1726773082.07168: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 10736 1726773082.07306: in run() - task 0affffe7-6841-885f-bbcf-00000000078b 10736 1726773082.07323: variable 'ansible_search_path' from source: unknown 10736 1726773082.07327: variable 'ansible_search_path' from source: unknown 10736 1726773082.07354: calling self._execute() 10736 1726773082.07427: variable 'ansible_host' from source: host vars for 'managed_node2' 10736 1726773082.07435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10736 1726773082.07444: variable 'omit' from source: magic vars 10736 1726773082.07778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10736 1726773082.07957: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10736 1726773082.07994: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10736 1726773082.08024: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10736 1726773082.08051: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10736 1726773082.08114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10736 1726773082.08136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10736 1726773082.08155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10736 1726773082.08175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10736 1726773082.08268: variable '__kernel_settings_is_ostree' from source: set_fact 10736 1726773082.08279: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10736 1726773082.08284: when evaluation is False, skipping this task 10736 1726773082.08293: _execute() done 10736 1726773082.08297: dumping result to json 10736 1726773082.08303: done dumping result, returning 10736 1726773082.08310: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-885f-bbcf-00000000078b] 10736 1726773082.08315: sending task result for task 0affffe7-6841-885f-bbcf-00000000078b 10736 1726773082.08338: done sending task result for task 0affffe7-6841-885f-bbcf-00000000078b 10736 1726773082.08341: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773082.08448: no more pending results, returning what we have 8240 1726773082.08451: results queue empty 8240 1726773082.08452: checking for any_errors_fatal 8240 1726773082.08459: done checking for any_errors_fatal 8240 1726773082.08459: checking for max_fail_percentage 8240 1726773082.08460: done checking for max_fail_percentage 8240 1726773082.08461: checking to see if all hosts have failed and the 
running result is not ok 8240 1726773082.08462: done checking to see if all hosts have failed 8240 1726773082.08462: getting the remaining hosts for this loop 8240 1726773082.08464: done getting the remaining hosts for this loop 8240 1726773082.08467: getting the next task for host managed_node2 8240 1726773082.08473: done getting next task for host managed_node2 8240 1726773082.08477: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8240 1726773082.08480: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773082.08497: getting variables 8240 1726773082.08499: in VariableManager get_vars() 8240 1726773082.08532: Calling all_inventory to load vars for managed_node2 8240 1726773082.08534: Calling groups_inventory to load vars for managed_node2 8240 1726773082.08536: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773082.08545: Calling all_plugins_play to load vars for managed_node2 8240 1726773082.08547: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773082.08549: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.08658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.08780: done with get_vars() 8240 1726773082.08790: done getting variables 8240 1726773082.08829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.020) 0:01:00.732 **** 8240 1726773082.08852: entering _queue_task() for managed_node2/set_fact 8240 1726773082.09004: worker is 1 (out of 1 available) 8240 1726773082.09017: exiting _queue_task() for managed_node2/set_fact 8240 1726773082.09030: done queuing things up, now waiting for results queue to drain 8240 1726773082.09032: waiting for pending results... 
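
The ostree probe above is skipped because __kernel_settings_is_ostree already comes from set_fact, presumably cached by the earlier application of the role in this play, so the guard "not __kernel_settings_is_ostree is defined" evaluates False. The sketch below illustrates the probe-and-cache shape implied by these records; the probed path is an assumption (the stat arguments are not printed in this log), while the task names, flag name, and when expression match the log.

# Probe once per host, cache the result so later role runs skip the stat.
- hosts: managed_node2
  gather_facts: false
  tasks:
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # assumed probe target, not shown in the log
      register: __ostree_stat
      when: not __kernel_settings_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __kernel_settings_is_ostree: "{{ __ostree_stat.stat.exists }}"
      when: not __kernel_settings_is_ostree is defined
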
10737 1726773082.09155: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 10737 1726773082.09287: in run() - task 0affffe7-6841-885f-bbcf-00000000078c 10737 1726773082.09304: variable 'ansible_search_path' from source: unknown 10737 1726773082.09308: variable 'ansible_search_path' from source: unknown 10737 1726773082.09335: calling self._execute() 10737 1726773082.09404: variable 'ansible_host' from source: host vars for 'managed_node2' 10737 1726773082.09413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10737 1726773082.09422: variable 'omit' from source: magic vars 10737 1726773082.09746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10737 1726773082.09967: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10737 1726773082.10007: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10737 1726773082.10034: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10737 1726773082.10057: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10737 1726773082.10118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10737 1726773082.10138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10737 1726773082.10157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10737 1726773082.10176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10737 1726773082.10265: variable '__kernel_settings_is_ostree' from source: set_fact 10737 1726773082.10277: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 10737 1726773082.10281: when evaluation is False, skipping this task 10737 1726773082.10286: _execute() done 10737 1726773082.10290: dumping result to json 10737 1726773082.10294: done dumping result, returning 10737 1726773082.10302: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-885f-bbcf-00000000078c] 10737 1726773082.10307: sending task result for task 0affffe7-6841-885f-bbcf-00000000078c 10737 1726773082.10327: done sending task result for task 0affffe7-6841-885f-bbcf-00000000078c 10737 1726773082.10329: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773082.10510: no more pending results, returning what we have 8240 1726773082.10513: results queue empty 8240 1726773082.10514: checking for any_errors_fatal 8240 1726773082.10518: done checking for any_errors_fatal 8240 1726773082.10519: checking for max_fail_percentage 8240 1726773082.10522: done checking for max_fail_percentage 8240 1726773082.10523: checking to see if all 
hosts have failed and the running result is not ok 8240 1726773082.10523: done checking to see if all hosts have failed 8240 1726773082.10524: getting the remaining hosts for this loop 8240 1726773082.10525: done getting the remaining hosts for this loop 8240 1726773082.10528: getting the next task for host managed_node2 8240 1726773082.10534: done getting next task for host managed_node2 8240 1726773082.10537: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8240 1726773082.10539: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773082.10551: getting variables 8240 1726773082.10552: in VariableManager get_vars() 8240 1726773082.10576: Calling all_inventory to load vars for managed_node2 8240 1726773082.10578: Calling groups_inventory to load vars for managed_node2 8240 1726773082.10579: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773082.10588: Calling all_plugins_play to load vars for managed_node2 8240 1726773082.10590: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773082.10591: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.10740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.10856: done with get_vars() 8240 1726773082.10863: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.020) 0:01:00.753 **** 8240 1726773082.10927: entering _queue_task() for managed_node2/stat 8240 1726773082.11070: worker is 1 (out of 1 available) 8240 1726773082.11084: exiting _queue_task() for managed_node2/stat 8240 1726773082.11098: done queuing things up, now waiting for results queue to drain 8240 1726773082.11099: waiting for pending results... 
10738 1726773082.11222: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 10738 1726773082.11353: in run() - task 0affffe7-6841-885f-bbcf-00000000078e 10738 1726773082.11370: variable 'ansible_search_path' from source: unknown 10738 1726773082.11374: variable 'ansible_search_path' from source: unknown 10738 1726773082.11404: calling self._execute() 10738 1726773082.11470: variable 'ansible_host' from source: host vars for 'managed_node2' 10738 1726773082.11479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10738 1726773082.11489: variable 'omit' from source: magic vars 10738 1726773082.11813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10738 1726773082.11980: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10738 1726773082.12017: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10738 1726773082.12044: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10738 1726773082.12070: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10738 1726773082.12132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10738 1726773082.12152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10738 1726773082.12173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10738 1726773082.12194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10738 1726773082.12282: variable '__kernel_settings_is_transactional' from source: set_fact 10738 1726773082.12295: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10738 1726773082.12303: when evaluation is False, skipping this task 10738 1726773082.12307: _execute() done 10738 1726773082.12311: dumping result to json 10738 1726773082.12315: done dumping result, returning 10738 1726773082.12321: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-885f-bbcf-00000000078e] 10738 1726773082.12326: sending task result for task 0affffe7-6841-885f-bbcf-00000000078e 10738 1726773082.12350: done sending task result for task 0affffe7-6841-885f-bbcf-00000000078e 10738 1726773082.12353: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8240 1726773082.12463: no more pending results, returning what we have 8240 1726773082.12466: results queue empty 8240 1726773082.12467: checking for any_errors_fatal 8240 1726773082.12472: done checking for any_errors_fatal 8240 1726773082.12473: checking for max_fail_percentage 8240 1726773082.12474: done checking for max_fail_percentage 8240 
1726773082.12474: checking to see if all hosts have failed and the running result is not ok 8240 1726773082.12475: done checking to see if all hosts have failed 8240 1726773082.12476: getting the remaining hosts for this loop 8240 1726773082.12477: done getting the remaining hosts for this loop 8240 1726773082.12480: getting the next task for host managed_node2 8240 1726773082.12489: done getting next task for host managed_node2 8240 1726773082.12492: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8240 1726773082.12495: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773082.12511: getting variables 8240 1726773082.12512: in VariableManager get_vars() 8240 1726773082.12546: Calling all_inventory to load vars for managed_node2 8240 1726773082.12549: Calling groups_inventory to load vars for managed_node2 8240 1726773082.12550: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773082.12559: Calling all_plugins_play to load vars for managed_node2 8240 1726773082.12561: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773082.12563: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.12673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.12804: done with get_vars() 8240 1726773082.12811: done getting variables 8240 1726773082.12850: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.019) 0:01:00.772 **** 8240 1726773082.12874: entering _queue_task() for managed_node2/set_fact 8240 1726773082.13036: worker is 1 (out of 1 available) 8240 1726773082.13051: exiting _queue_task() for managed_node2/set_fact 8240 1726773082.13064: done queuing things up, now waiting for results queue to drain 8240 1726773082.13065: waiting for pending results... 
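
The "Check if transactional-update exists in /sbin" task (set_vars.yml:22) is skipped for the same reason: __kernel_settings_is_transactional is already defined from a previous run. A plausible sketch of the probe, assuming a simple stat; the task name, the /sbin location, and the when-condition come from the log, the register name is an assumption:

- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat    # assumed register name
  when: not __kernel_settings_is_transactional is defined
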
10739 1726773082.13195: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 10739 1726773082.13328: in run() - task 0affffe7-6841-885f-bbcf-00000000078f 10739 1726773082.13345: variable 'ansible_search_path' from source: unknown 10739 1726773082.13349: variable 'ansible_search_path' from source: unknown 10739 1726773082.13375: calling self._execute() 10739 1726773082.13451: variable 'ansible_host' from source: host vars for 'managed_node2' 10739 1726773082.13460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10739 1726773082.13470: variable 'omit' from source: magic vars 10739 1726773082.13808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10739 1726773082.14046: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10739 1726773082.14078: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10739 1726773082.14104: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10739 1726773082.14127: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10739 1726773082.14187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10739 1726773082.14208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10739 1726773082.14227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10739 1726773082.14245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10739 1726773082.14332: variable '__kernel_settings_is_transactional' from source: set_fact 10739 1726773082.14344: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 10739 1726773082.14349: when evaluation is False, skipping this task 10739 1726773082.14352: _execute() done 10739 1726773082.14356: dumping result to json 10739 1726773082.14360: done dumping result, returning 10739 1726773082.14365: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-885f-bbcf-00000000078f] 10739 1726773082.14372: sending task result for task 0affffe7-6841-885f-bbcf-00000000078f 10739 1726773082.14395: done sending task result for task 0affffe7-6841-885f-bbcf-00000000078f 10739 1726773082.14398: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8240 1726773082.14619: no more pending results, returning what we have 8240 1726773082.14622: results queue empty 8240 1726773082.14623: checking for any_errors_fatal 8240 1726773082.14629: done checking for any_errors_fatal 8240 1726773082.14629: checking for max_fail_percentage 8240 1726773082.14631: done checking for max_fail_percentage 8240 1726773082.14631: 
checking to see if all hosts have failed and the running result is not ok 8240 1726773082.14632: done checking to see if all hosts have failed 8240 1726773082.14632: getting the remaining hosts for this loop 8240 1726773082.14633: done getting the remaining hosts for this loop 8240 1726773082.14635: getting the next task for host managed_node2 8240 1726773082.14642: done getting next task for host managed_node2 8240 1726773082.14644: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8240 1726773082.14647: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773082.14658: getting variables 8240 1726773082.14659: in VariableManager get_vars() 8240 1726773082.14683: Calling all_inventory to load vars for managed_node2 8240 1726773082.14687: Calling groups_inventory to load vars for managed_node2 8240 1726773082.14688: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773082.14695: Calling all_plugins_play to load vars for managed_node2 8240 1726773082.14697: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773082.14699: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.14856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.14973: done with get_vars() 8240 1726773082.14979: done getting variables 8240 1726773082.15023: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.021) 0:01:00.794 **** 8240 1726773082.15046: entering _queue_task() for managed_node2/include_vars 8240 1726773082.15207: worker is 1 (out of 1 available) 8240 1726773082.15221: exiting _queue_task() for managed_node2/include_vars 8240 1726773082.15234: done queuing things up, now waiting for results queue to drain 8240 1726773082.15235: waiting for pending results... 
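
"Set flag if transactional-update exists" (set_vars.yml:27) pairs with the stat probe above, turning its result into a persistent fact so later runs can skip the check entirely, which is exactly what happens here. A hedged sketch, with the registered-variable name assumed:

- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
  when: not __kernel_settings_is_transactional is defined
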
10740 1726773082.15354: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 10740 1726773082.15477: in run() - task 0affffe7-6841-885f-bbcf-000000000791 10740 1726773082.15494: variable 'ansible_search_path' from source: unknown 10740 1726773082.15498: variable 'ansible_search_path' from source: unknown 10740 1726773082.15522: calling self._execute() 10740 1726773082.15591: variable 'ansible_host' from source: host vars for 'managed_node2' 10740 1726773082.15597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10740 1726773082.15604: variable 'omit' from source: magic vars 10740 1726773082.15669: variable 'omit' from source: magic vars 10740 1726773082.15722: variable 'omit' from source: magic vars 10740 1726773082.15983: variable 'ffparams' from source: task vars 10740 1726773082.16079: variable 'ansible_facts' from source: unknown 10740 1726773082.16211: variable 'ansible_facts' from source: unknown 10740 1726773082.16299: variable 'ansible_facts' from source: unknown 10740 1726773082.16387: variable 'ansible_facts' from source: unknown 10740 1726773082.16464: variable 'role_path' from source: magic vars 10740 1726773082.16579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10740 1726773082.16736: Loaded config def from plugin (lookup/first_found) 10740 1726773082.16744: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 10740 1726773082.16774: variable 'ansible_search_path' from source: unknown 10740 1726773082.16798: variable 'ansible_search_path' from source: unknown 10740 1726773082.16809: variable 'ansible_search_path' from source: unknown 10740 1726773082.16816: variable 'ansible_search_path' from source: unknown 10740 1726773082.16823: variable 'ansible_search_path' from source: unknown 10740 1726773082.16837: variable 'omit' from source: magic vars 10740 1726773082.16854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10740 1726773082.16874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10740 1726773082.16890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10740 1726773082.16905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10740 1726773082.16915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10740 1726773082.16936: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10740 1726773082.16941: variable 'ansible_host' from source: host vars for 'managed_node2' 10740 1726773082.16946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10740 1726773082.17010: Set connection var ansible_pipelining to False 10740 1726773082.17017: Set connection var ansible_timeout to 10 10740 1726773082.17022: Set connection var ansible_module_compression to ZIP_DEFLATED 10740 1726773082.17024: Set connection var ansible_shell_type to sh 10740 1726773082.17027: Set connection var ansible_shell_executable to /bin/sh 10740 1726773082.17030: Set connection var ansible_connection to ssh 10740 1726773082.17043: variable 'ansible_shell_executable' from source: unknown 10740 1726773082.17046: variable 'ansible_connection' from source: unknown 
10740 1726773082.17047: variable 'ansible_module_compression' from source: unknown 10740 1726773082.17049: variable 'ansible_shell_type' from source: unknown 10740 1726773082.17051: variable 'ansible_shell_executable' from source: unknown 10740 1726773082.17052: variable 'ansible_host' from source: host vars for 'managed_node2' 10740 1726773082.17054: variable 'ansible_pipelining' from source: unknown 10740 1726773082.17056: variable 'ansible_timeout' from source: unknown 10740 1726773082.17058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10740 1726773082.17135: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10740 1726773082.17146: variable 'omit' from source: magic vars 10740 1726773082.17151: starting attempt loop 10740 1726773082.17155: running the handler 10740 1726773082.17200: handler run complete 10740 1726773082.17211: attempt loop complete, returning result 10740 1726773082.17215: _execute() done 10740 1726773082.17218: dumping result to json 10740 1726773082.17222: done dumping result, returning 10740 1726773082.17228: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-885f-bbcf-000000000791] 10740 1726773082.17235: sending task result for task 0affffe7-6841-885f-bbcf-000000000791 10740 1726773082.17258: done sending task result for task 0affffe7-6841-885f-bbcf-000000000791 10740 1726773082.17261: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8240 1726773082.17405: no more pending results, returning what we have 8240 1726773082.17408: results queue empty 8240 1726773082.17409: checking for any_errors_fatal 8240 1726773082.17415: done checking for any_errors_fatal 8240 1726773082.17416: checking for max_fail_percentage 8240 1726773082.17417: done checking for max_fail_percentage 8240 1726773082.17418: checking to see if all hosts have failed and the running result is not ok 8240 1726773082.17419: done checking to see if all hosts have failed 8240 1726773082.17419: getting the remaining hosts for this loop 8240 1726773082.17420: done getting the remaining hosts for this loop 8240 1726773082.17424: getting the next task for host managed_node2 8240 1726773082.17431: done getting next task for host managed_node2 8240 1726773082.17434: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8240 1726773082.17436: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8240 1726773082.17447: getting variables 8240 1726773082.17449: in VariableManager get_vars() 8240 1726773082.17476: Calling all_inventory to load vars for managed_node2 8240 1726773082.17478: Calling groups_inventory to load vars for managed_node2 8240 1726773082.17479: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773082.17488: Calling all_plugins_play to load vars for managed_node2 8240 1726773082.17490: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773082.17492: Calling groups_plugins_play to load vars for managed_node2 8240 1726773082.17603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773082.17727: done with get_vars() 8240 1726773082.17735: done getting variables 8240 1726773082.17773: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:22 -0400 (0:00:00.027) 0:01:00.821 **** 8240 1726773082.17798: entering _queue_task() for managed_node2/package 8240 1726773082.17953: worker is 1 (out of 1 available) 8240 1726773082.17968: exiting _queue_task() for managed_node2/package 8240 1726773082.17982: done queuing things up, now waiting for results queue to drain 8240 1726773082.17984: waiting for pending results... 
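
The "Set platform/version specific variables" task (set_vars.yml:31) that just completed uses include_vars with the first_found lookup: the log shows the ffparams task var, several ansible_facts references, and role_path feeding the candidate list, with vars/default.yml being the file actually loaded; it sets __kernel_settings_packages and __kernel_settings_services. A sketch of that pattern, assuming the usual distribution/version candidate naming; only default.yml and the resulting fact names are confirmed by the log:

- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', ffparams) }}"
  vars:
    ffparams:
      files:
        # candidate names are assumptions modelled on the common convention
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - default.yml                      # the file loaded in the log above
      paths:
        - "{{ role_path }}/vars"
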
10741 1726773082.18106: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 10741 1726773082.18218: in run() - task 0affffe7-6841-885f-bbcf-00000000060f 10741 1726773082.18234: variable 'ansible_search_path' from source: unknown 10741 1726773082.18238: variable 'ansible_search_path' from source: unknown 10741 1726773082.18264: calling self._execute() 10741 1726773082.18335: variable 'ansible_host' from source: host vars for 'managed_node2' 10741 1726773082.18343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10741 1726773082.18351: variable 'omit' from source: magic vars 10741 1726773082.18424: variable 'omit' from source: magic vars 10741 1726773082.18459: variable 'omit' from source: magic vars 10741 1726773082.18479: variable '__kernel_settings_packages' from source: include_vars 10741 1726773082.18692: variable '__kernel_settings_packages' from source: include_vars 10741 1726773082.18907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10741 1726773082.20354: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10741 1726773082.20405: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10741 1726773082.20434: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10741 1726773082.20460: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10741 1726773082.20492: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10741 1726773082.20558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10741 1726773082.20579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10741 1726773082.20600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10741 1726773082.20625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10741 1726773082.20634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10741 1726773082.20703: variable '__kernel_settings_is_ostree' from source: set_fact 10741 1726773082.20709: variable 'omit' from source: magic vars 10741 1726773082.20729: variable 'omit' from source: magic vars 10741 1726773082.20746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10741 1726773082.20764: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10741 1726773082.20778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10741 1726773082.20793: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10741 1726773082.20808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10741 1726773082.20834: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10741 1726773082.20839: variable 'ansible_host' from source: host vars for 'managed_node2' 10741 1726773082.20844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10741 1726773082.20911: Set connection var ansible_pipelining to False 10741 1726773082.20918: Set connection var ansible_timeout to 10 10741 1726773082.20927: Set connection var ansible_module_compression to ZIP_DEFLATED 10741 1726773082.20930: Set connection var ansible_shell_type to sh 10741 1726773082.20935: Set connection var ansible_shell_executable to /bin/sh 10741 1726773082.20940: Set connection var ansible_connection to ssh 10741 1726773082.20956: variable 'ansible_shell_executable' from source: unknown 10741 1726773082.20960: variable 'ansible_connection' from source: unknown 10741 1726773082.20963: variable 'ansible_module_compression' from source: unknown 10741 1726773082.20966: variable 'ansible_shell_type' from source: unknown 10741 1726773082.20970: variable 'ansible_shell_executable' from source: unknown 10741 1726773082.20973: variable 'ansible_host' from source: host vars for 'managed_node2' 10741 1726773082.20977: variable 'ansible_pipelining' from source: unknown 10741 1726773082.20981: variable 'ansible_timeout' from source: unknown 10741 1726773082.20986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10741 1726773082.21052: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10741 1726773082.21063: variable 'omit' from source: magic vars 10741 1726773082.21070: starting attempt loop 10741 1726773082.21073: running the handler 10741 1726773082.21134: variable 'ansible_facts' from source: unknown 10741 1726773082.21216: _low_level_execute_command(): starting 10741 1726773082.21224: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10741 1726773082.23598: stdout chunk (state=2): >>>/root <<< 10741 1726773082.23719: stderr chunk (state=3): >>><<< 10741 1726773082.23726: stdout chunk (state=3): >>><<< 10741 1726773082.23746: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10741 1726773082.23759: _low_level_execute_command(): starting 10741 1726773082.23764: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849 `" && echo ansible-tmp-1726773082.237545-10741-28494092558849="` echo /root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849 `" ) && sleep 0' 10741 1726773082.26299: stdout chunk (state=2): >>>ansible-tmp-1726773082.237545-10741-28494092558849=/root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849 <<< 10741 1726773082.26447: stderr chunk (state=3): >>><<< 10741 1726773082.26453: stdout chunk (state=3): >>><<< 10741 1726773082.26467: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773082.237545-10741-28494092558849=/root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849 , stderr= 10741 1726773082.26492: variable 'ansible_module_compression' from source: unknown 10741 1726773082.26539: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10741 1726773082.26572: variable 'ansible_facts' from source: unknown 10741 1726773082.26664: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849/AnsiballZ_dnf.py 10741 1726773082.26765: Sending initial data 10741 1726773082.26773: Sent initial data (149 bytes) 10741 1726773082.29284: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpevms0tdy /root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849/AnsiballZ_dnf.py <<< 10741 1726773082.30688: stderr chunk (state=3): >>><<< 10741 1726773082.30695: stdout chunk (state=3): >>><<< 10741 1726773082.30717: done transferring module to remote 10741 1726773082.30728: _low_level_execute_command(): starting 10741 1726773082.30733: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849/ /root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849/AnsiballZ_dnf.py && sleep 0' 10741 1726773082.33099: stderr chunk (state=2): >>><<< 10741 1726773082.33110: stdout chunk (state=2): >>><<< 10741 1726773082.33125: _low_level_execute_command() done: rc=0, stdout=, stderr= 10741 1726773082.33129: _low_level_execute_command(): starting 10741 1726773082.33134: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849/AnsiballZ_dnf.py && sleep 0' 10741 1726773084.91750: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 10741 1726773084.99905: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 10741 1726773084.99954: stderr chunk (state=3): >>><<< 10741 1726773084.99961: stdout chunk (state=3): >>><<< 10741 1726773084.99980: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.9.64 closed. 10741 1726773085.00015: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10741 1726773085.00023: _low_level_execute_command(): starting 10741 1726773085.00029: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773082.237545-10741-28494092558849/ > /dev/null 2>&1 && sleep 0' 10741 1726773085.02541: stderr chunk (state=2): >>><<< 10741 1726773085.02550: stdout chunk (state=2): >>><<< 10741 1726773085.02564: _low_level_execute_command() done: rc=0, stdout=, stderr= 10741 1726773085.02572: handler run complete 10741 1726773085.02599: attempt loop complete, returning result 10741 1726773085.02603: _execute() done 10741 1726773085.02607: dumping result to json 10741 1726773085.02612: done dumping result, returning 10741 1726773085.02619: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-885f-bbcf-00000000060f] 10741 1726773085.02626: sending task result for task 0affffe7-6841-885f-bbcf-00000000060f 10741 1726773085.02654: done sending task result for task 0affffe7-6841-885f-bbcf-00000000060f 10741 1726773085.02658: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8240 1726773085.02850: no more pending results, returning what we have 8240 1726773085.02853: results queue empty 8240 1726773085.02854: checking for any_errors_fatal 8240 1726773085.02863: done checking for any_errors_fatal 8240 1726773085.02864: checking for max_fail_percentage 8240 1726773085.02865: done checking for max_fail_percentage 8240 1726773085.02866: checking to see if all hosts have failed and the running result is not ok 8240 1726773085.02867: done checking to see if all hosts have failed 8240 1726773085.02868: getting the remaining hosts for this loop 8240 
1726773085.02869: done getting the remaining hosts for this loop 8240 1726773085.02873: getting the next task for host managed_node2 8240 1726773085.02881: done getting next task for host managed_node2 8240 1726773085.02884: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8240 1726773085.02889: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773085.02899: getting variables 8240 1726773085.02903: in VariableManager get_vars() 8240 1726773085.02930: Calling all_inventory to load vars for managed_node2 8240 1726773085.02932: Calling groups_inventory to load vars for managed_node2 8240 1726773085.02934: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773085.02941: Calling all_plugins_play to load vars for managed_node2 8240 1726773085.02943: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773085.02945: Calling groups_plugins_play to load vars for managed_node2 8240 1726773085.03106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773085.03225: done with get_vars() 8240 1726773085.03234: done getting variables 8240 1726773085.03280: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:25 -0400 (0:00:02.855) 0:01:03.676 **** 8240 1726773085.03308: entering _queue_task() for managed_node2/debug 8240 1726773085.03481: worker is 1 (out of 1 available) 8240 1726773085.03497: exiting _queue_task() for managed_node2/debug 8240 1726773085.03512: done queuing things up, now waiting for results queue to drain 8240 1726773085.03514: waiting for pending results... 
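
"Ensure required packages are installed" (main.yml:12) passes __kernel_settings_packages (tuned, python3-configobj) to the generic package action, which resolved to ansible.legacy.dnf on this host; the module reported "Nothing to do" because both packages were already present. The log also shows __kernel_settings_is_ostree being consulted during this task, presumably to adjust behaviour on ostree systems; that branch is omitted below. A hedged sketch of the task as the log implies it:

- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"   # ['tuned', 'python3-configobj'] per vars/default.yml
    state: present
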
10790 1726773085.03642: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 10790 1726773085.03762: in run() - task 0affffe7-6841-885f-bbcf-000000000611 10790 1726773085.03780: variable 'ansible_search_path' from source: unknown 10790 1726773085.03784: variable 'ansible_search_path' from source: unknown 10790 1726773085.03814: calling self._execute() 10790 1726773085.03889: variable 'ansible_host' from source: host vars for 'managed_node2' 10790 1726773085.03898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10790 1726773085.03908: variable 'omit' from source: magic vars 10790 1726773085.04258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10790 1726773085.05798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10790 1726773085.05846: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10790 1726773085.05874: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10790 1726773085.05905: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10790 1726773085.05926: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10790 1726773085.05982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10790 1726773085.06008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10790 1726773085.06027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10790 1726773085.06054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10790 1726773085.06065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10790 1726773085.06147: variable '__kernel_settings_is_transactional' from source: set_fact 10790 1726773085.06163: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10790 1726773085.06168: when evaluation is False, skipping this task 10790 1726773085.06172: _execute() done 10790 1726773085.06176: dumping result to json 10790 1726773085.06180: done dumping result, returning 10790 1726773085.06188: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-000000000611] 10790 1726773085.06193: sending task result for task 0affffe7-6841-885f-bbcf-000000000611 10790 1726773085.06218: done sending task result for task 0affffe7-6841-885f-bbcf-000000000611 10790 1726773085.06221: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | 
d(false)" } 8240 1726773085.06406: no more pending results, returning what we have 8240 1726773085.06409: results queue empty 8240 1726773085.06410: checking for any_errors_fatal 8240 1726773085.06421: done checking for any_errors_fatal 8240 1726773085.06421: checking for max_fail_percentage 8240 1726773085.06423: done checking for max_fail_percentage 8240 1726773085.06423: checking to see if all hosts have failed and the running result is not ok 8240 1726773085.06424: done checking to see if all hosts have failed 8240 1726773085.06425: getting the remaining hosts for this loop 8240 1726773085.06426: done getting the remaining hosts for this loop 8240 1726773085.06429: getting the next task for host managed_node2 8240 1726773085.06435: done getting next task for host managed_node2 8240 1726773085.06437: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8240 1726773085.06439: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773085.06450: getting variables 8240 1726773085.06451: in VariableManager get_vars() 8240 1726773085.06479: Calling all_inventory to load vars for managed_node2 8240 1726773085.06481: Calling groups_inventory to load vars for managed_node2 8240 1726773085.06483: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773085.06492: Calling all_plugins_play to load vars for managed_node2 8240 1726773085.06494: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773085.06496: Calling groups_plugins_play to load vars for managed_node2 8240 1726773085.06612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773085.06810: done with get_vars() 8240 1726773085.06822: done getting variables 8240 1726773085.06880: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.036) 0:01:03.713 **** 8240 1726773085.06917: entering _queue_task() for managed_node2/reboot 8240 1726773085.07130: worker is 1 (out of 1 available) 8240 1726773085.07145: exiting _queue_task() for managed_node2/reboot 8240 1726773085.07158: done queuing things up, now waiting for results queue to drain 8240 1726773085.07160: waiting for pending results... 
10793 1726773085.07393: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 10793 1726773085.07542: in run() - task 0affffe7-6841-885f-bbcf-000000000612 10793 1726773085.07560: variable 'ansible_search_path' from source: unknown 10793 1726773085.07565: variable 'ansible_search_path' from source: unknown 10793 1726773085.07603: calling self._execute() 10793 1726773085.07692: variable 'ansible_host' from source: host vars for 'managed_node2' 10793 1726773085.07705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10793 1726773085.07715: variable 'omit' from source: magic vars 10793 1726773085.08064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10793 1726773085.09618: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10793 1726773085.09666: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10793 1726773085.09696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10793 1726773085.09724: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10793 1726773085.09745: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10793 1726773085.09805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10793 1726773085.09835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10793 1726773085.09854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10793 1726773085.09883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10793 1726773085.09897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10793 1726773085.09976: variable '__kernel_settings_is_transactional' from source: set_fact 10793 1726773085.09994: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10793 1726773085.09999: when evaluation is False, skipping this task 10793 1726773085.10005: _execute() done 10793 1726773085.10009: dumping result to json 10793 1726773085.10013: done dumping result, returning 10793 1726773085.10019: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-885f-bbcf-000000000612] 10793 1726773085.10024: sending task result for task 0affffe7-6841-885f-bbcf-000000000612 10793 1726773085.10048: done sending task result for task 0affffe7-6841-885f-bbcf-000000000612 10793 1726773085.10051: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", 
"skip_reason": "Conditional result was False" } 8240 1726773085.10158: no more pending results, returning what we have 8240 1726773085.10161: results queue empty 8240 1726773085.10162: checking for any_errors_fatal 8240 1726773085.10169: done checking for any_errors_fatal 8240 1726773085.10170: checking for max_fail_percentage 8240 1726773085.10171: done checking for max_fail_percentage 8240 1726773085.10172: checking to see if all hosts have failed and the running result is not ok 8240 1726773085.10173: done checking to see if all hosts have failed 8240 1726773085.10174: getting the remaining hosts for this loop 8240 1726773085.10175: done getting the remaining hosts for this loop 8240 1726773085.10178: getting the next task for host managed_node2 8240 1726773085.10187: done getting next task for host managed_node2 8240 1726773085.10190: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8240 1726773085.10192: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773085.10208: getting variables 8240 1726773085.10210: in VariableManager get_vars() 8240 1726773085.10242: Calling all_inventory to load vars for managed_node2 8240 1726773085.10245: Calling groups_inventory to load vars for managed_node2 8240 1726773085.10247: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773085.10256: Calling all_plugins_play to load vars for managed_node2 8240 1726773085.10259: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773085.10261: Calling groups_plugins_play to load vars for managed_node2 8240 1726773085.10426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773085.10540: done with get_vars() 8240 1726773085.10548: done getting variables 8240 1726773085.10591: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.036) 0:01:03.750 **** 8240 1726773085.10617: entering _queue_task() for managed_node2/fail 8240 1726773085.10787: worker is 1 (out of 1 available) 8240 1726773085.10802: exiting _queue_task() for managed_node2/fail 8240 1726773085.10815: done queuing things up, now waiting for results queue to drain 8240 1726773085.10817: waiting for pending results... 
10795 1726773085.10944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 10795 1726773085.11067: in run() - task 0affffe7-6841-885f-bbcf-000000000613 10795 1726773085.11084: variable 'ansible_search_path' from source: unknown 10795 1726773085.11090: variable 'ansible_search_path' from source: unknown 10795 1726773085.11120: calling self._execute() 10795 1726773085.11193: variable 'ansible_host' from source: host vars for 'managed_node2' 10795 1726773085.11204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10795 1726773085.11214: variable 'omit' from source: magic vars 10795 1726773085.11561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10795 1726773085.13095: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10795 1726773085.13151: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10795 1726773085.13181: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10795 1726773085.13212: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10795 1726773085.13229: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10795 1726773085.13283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10795 1726773085.13313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10795 1726773085.13333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10795 1726773085.13359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10795 1726773085.13370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10795 1726773085.13449: variable '__kernel_settings_is_transactional' from source: set_fact 10795 1726773085.13466: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 10795 1726773085.13471: when evaluation is False, skipping this task 10795 1726773085.13474: _execute() done 10795 1726773085.13478: dumping result to json 10795 1726773085.13482: done dumping result, returning 10795 1726773085.13490: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-885f-bbcf-000000000613] 10795 1726773085.13496: sending task result for task 0affffe7-6841-885f-bbcf-000000000613 10795 1726773085.13521: done sending task result for task 0affffe7-6841-885f-bbcf-000000000613 10795 1726773085.13525: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | 
d(false)", "skip_reason": "Conditional result was False" } 8240 1726773085.13637: no more pending results, returning what we have 8240 1726773085.13640: results queue empty 8240 1726773085.13641: checking for any_errors_fatal 8240 1726773085.13650: done checking for any_errors_fatal 8240 1726773085.13650: checking for max_fail_percentage 8240 1726773085.13652: done checking for max_fail_percentage 8240 1726773085.13652: checking to see if all hosts have failed and the running result is not ok 8240 1726773085.13653: done checking to see if all hosts have failed 8240 1726773085.13654: getting the remaining hosts for this loop 8240 1726773085.13655: done getting the remaining hosts for this loop 8240 1726773085.13658: getting the next task for host managed_node2 8240 1726773085.13667: done getting next task for host managed_node2 8240 1726773085.13670: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8240 1726773085.13673: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773085.13690: getting variables 8240 1726773085.13692: in VariableManager get_vars() 8240 1726773085.13729: Calling all_inventory to load vars for managed_node2 8240 1726773085.13732: Calling groups_inventory to load vars for managed_node2 8240 1726773085.13734: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773085.13743: Calling all_plugins_play to load vars for managed_node2 8240 1726773085.13746: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773085.13748: Calling groups_plugins_play to load vars for managed_node2 8240 1726773085.13869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773085.13992: done with get_vars() 8240 1726773085.14002: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.034) 0:01:03.784 **** 8240 1726773085.14062: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773085.14238: worker is 1 (out of 1 available) 8240 1726773085.14253: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773085.14267: done queuing things up, now waiting for results queue to drain 8240 1726773085.14269: waiting for pending results... 
10796 1726773085.14394: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 10796 1726773085.14516: in run() - task 0affffe7-6841-885f-bbcf-000000000615 10796 1726773085.14534: variable 'ansible_search_path' from source: unknown 10796 1726773085.14538: variable 'ansible_search_path' from source: unknown 10796 1726773085.14566: calling self._execute() 10796 1726773085.14634: variable 'ansible_host' from source: host vars for 'managed_node2' 10796 1726773085.14644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10796 1726773085.14652: variable 'omit' from source: magic vars 10796 1726773085.14729: variable 'omit' from source: magic vars 10796 1726773085.14762: variable 'omit' from source: magic vars 10796 1726773085.14784: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10796 1726773085.14999: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 10796 1726773085.15060: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10796 1726773085.15093: variable 'omit' from source: magic vars 10796 1726773085.15125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10796 1726773085.15153: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10796 1726773085.15172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10796 1726773085.15241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10796 1726773085.15253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10796 1726773085.15276: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10796 1726773085.15280: variable 'ansible_host' from source: host vars for 'managed_node2' 10796 1726773085.15283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10796 1726773085.15357: Set connection var ansible_pipelining to False 10796 1726773085.15366: Set connection var ansible_timeout to 10 10796 1726773085.15374: Set connection var ansible_module_compression to ZIP_DEFLATED 10796 1726773085.15377: Set connection var ansible_shell_type to sh 10796 1726773085.15382: Set connection var ansible_shell_executable to /bin/sh 10796 1726773085.15388: Set connection var ansible_connection to ssh 10796 1726773085.15406: variable 'ansible_shell_executable' from source: unknown 10796 1726773085.15411: variable 'ansible_connection' from source: unknown 10796 1726773085.15414: variable 'ansible_module_compression' from source: unknown 10796 1726773085.15417: variable 'ansible_shell_type' from source: unknown 10796 1726773085.15421: variable 'ansible_shell_executable' from source: unknown 10796 1726773085.15424: variable 'ansible_host' from source: host vars for 'managed_node2' 10796 1726773085.15428: variable 'ansible_pipelining' from source: unknown 10796 1726773085.15431: variable 'ansible_timeout' from source: unknown 10796 1726773085.15436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10796 1726773085.15560: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10796 1726773085.15572: variable 'omit' from source: magic vars 10796 1726773085.15579: starting attempt loop 10796 1726773085.15582: running the handler 10796 1726773085.15595: _low_level_execute_command(): starting 10796 1726773085.15605: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10796 1726773085.17938: stdout chunk (state=2): >>>/root <<< 10796 1726773085.18053: stderr chunk (state=3): >>><<< 10796 1726773085.18060: stdout chunk (state=3): >>><<< 10796 1726773085.18078: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10796 1726773085.18095: _low_level_execute_command(): starting 10796 1726773085.18103: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977 `" && echo ansible-tmp-1726773085.1808913-10796-29182717085977="` echo /root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977 `" ) && sleep 0' 10796 1726773085.20802: stdout chunk (state=2): >>>ansible-tmp-1726773085.1808913-10796-29182717085977=/root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977 <<< 10796 1726773085.20934: stderr chunk (state=3): >>><<< 10796 1726773085.20942: stdout chunk (state=3): >>><<< 10796 1726773085.20958: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773085.1808913-10796-29182717085977=/root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977 , stderr= 10796 1726773085.20998: variable 'ansible_module_compression' from source: unknown 10796 1726773085.21033: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10796 1726773085.21063: variable 'ansible_facts' from source: unknown 10796 1726773085.21132: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977/AnsiballZ_kernel_settings_get_config.py 10796 1726773085.21236: Sending initial data 10796 1726773085.21243: Sent initial data (173 bytes) 10796 1726773085.23881: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp7r9olty1 /root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977/AnsiballZ_kernel_settings_get_config.py <<< 10796 1726773085.25692: stderr chunk (state=3): >>><<< 10796 1726773085.25702: stdout chunk (state=3): >>><<< 10796 1726773085.25727: done transferring module to remote 10796 1726773085.25740: _low_level_execute_command(): starting 10796 1726773085.25746: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977/ /root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10796 1726773085.28328: stderr chunk (state=2): >>><<< 10796 1726773085.28338: stdout chunk (state=2): >>><<< 10796 1726773085.28355: _low_level_execute_command() done: rc=0, stdout=, stderr= 10796 1726773085.28359: _low_level_execute_command(): starting 10796 1726773085.28365: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10796 1726773085.44115: stdout chunk 
(state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 10796 1726773085.45164: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10796 1726773085.45218: stderr chunk (state=3): >>><<< 10796 1726773085.45225: stdout chunk (state=3): >>><<< 10796 1726773085.45241: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 10796 1726773085.45270: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10796 1726773085.45281: _low_level_execute_command(): starting 10796 1726773085.45288: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773085.1808913-10796-29182717085977/ > /dev/null 2>&1 && sleep 0' 10796 1726773085.47731: stderr chunk (state=2): >>><<< 10796 1726773085.47742: stdout chunk (state=2): >>><<< 10796 1726773085.47758: _low_level_execute_command() done: rc=0, stdout=, stderr= 10796 1726773085.47765: handler run complete 10796 1726773085.47781: attempt loop complete, returning result 10796 1726773085.47787: _execute() done 10796 1726773085.47790: dumping result to json 10796 1726773085.47795: done dumping result, returning 10796 1726773085.47805: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-885f-bbcf-000000000615] 10796 1726773085.47811: sending task result for task 0affffe7-6841-885f-bbcf-000000000615 10796 1726773085.47842: done sending task result for task 0affffe7-6841-885f-bbcf-000000000615 10796 1726773085.47845: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8240 1726773085.47992: no more pending results, returning what we have 8240 1726773085.47996: results queue empty 8240 1726773085.47997: checking for any_errors_fatal 8240 1726773085.48003: done checking for any_errors_fatal 8240 1726773085.48004: checking for max_fail_percentage 8240 1726773085.48005: done 
checking for max_fail_percentage 8240 1726773085.48006: checking to see if all hosts have failed and the running result is not ok 8240 1726773085.48007: done checking to see if all hosts have failed 8240 1726773085.48007: getting the remaining hosts for this loop 8240 1726773085.48008: done getting the remaining hosts for this loop 8240 1726773085.48012: getting the next task for host managed_node2 8240 1726773085.48018: done getting next task for host managed_node2 8240 1726773085.48022: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8240 1726773085.48024: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773085.48036: getting variables 8240 1726773085.48038: in VariableManager get_vars() 8240 1726773085.48072: Calling all_inventory to load vars for managed_node2 8240 1726773085.48075: Calling groups_inventory to load vars for managed_node2 8240 1726773085.48077: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773085.48089: Calling all_plugins_play to load vars for managed_node2 8240 1726773085.48092: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773085.48095: Calling groups_plugins_play to load vars for managed_node2 8240 1726773085.48254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773085.48371: done with get_vars() 8240 1726773085.48379: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:25 -0400 (0:00:00.343) 0:01:04.128 **** 8240 1726773085.48449: entering _queue_task() for managed_node2/stat 8240 1726773085.48611: worker is 1 (out of 1 available) 8240 1726773085.48627: exiting _queue_task() for managed_node2/stat 8240 1726773085.48639: done queuing things up, now waiting for results queue to drain 8240 1726773085.48641: waiting for pending results... 
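The ok: result just above comes from the role's own module, fedora.linux_system_roles.kernel_settings_get_config, called with a single path argument (/etc/tuned/tuned-main.conf) and returning the file's settings as the data dictionary. The surrounding _low_level_execute_command() entries also show Ansible's standard module delivery: create a temporary directory under ~/.ansible/tmp on the target, sftp the AnsiballZ_*.py wrapper across, chmod u+x it, execute it with /usr/libexec/platform-python, then remove the temporary directory. A sketch of an equivalent invocation using the variable names visible in the trace, offered as a reconstruction rather than the role's literal source:

    - name: Read tuned main config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: "{{ __kernel_settings_tuned_main_conf_file }}"
      register: __kernel_settings_register_tuned_main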
10812 1726773085.48775: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 10812 1726773085.48897: in run() - task 0affffe7-6841-885f-bbcf-000000000616 10812 1726773085.48917: variable 'ansible_search_path' from source: unknown 10812 1726773085.48922: variable 'ansible_search_path' from source: unknown 10812 1726773085.48959: variable '__prof_from_conf' from source: task vars 10812 1726773085.49205: variable '__prof_from_conf' from source: task vars 10812 1726773085.49345: variable '__data' from source: task vars 10812 1726773085.49399: variable '__kernel_settings_register_tuned_main' from source: set_fact 10812 1726773085.49545: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10812 1726773085.49555: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10812 1726773085.49599: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10812 1726773085.49618: variable 'omit' from source: magic vars 10812 1726773085.49699: variable 'ansible_host' from source: host vars for 'managed_node2' 10812 1726773085.49712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10812 1726773085.49721: variable 'omit' from source: magic vars 10812 1726773085.49894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10812 1726773085.51424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10812 1726773085.51470: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10812 1726773085.51503: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10812 1726773085.51530: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10812 1726773085.51551: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10812 1726773085.51611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10812 1726773085.51632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10812 1726773085.51651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10812 1726773085.51677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10812 1726773085.51691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10812 1726773085.51762: variable 'item' from source: unknown 10812 1726773085.51774: Evaluated conditional (item | length > 0): False 10812 1726773085.51779: when evaluation is False, skipping this task 10812 1726773085.51806: variable 'item' from source: unknown 10812 1726773085.51851: variable 'item' from source: unknown skipping: 
[managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 10812 1726773085.51930: variable 'ansible_host' from source: host vars for 'managed_node2' 10812 1726773085.51941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10812 1726773085.51950: variable 'omit' from source: magic vars 10812 1726773085.52079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10812 1726773085.52099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10812 1726773085.52119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10812 1726773085.52145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10812 1726773085.52153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10812 1726773085.52208: variable 'item' from source: unknown 10812 1726773085.52217: Evaluated conditional (item | length > 0): True 10812 1726773085.52224: variable 'omit' from source: magic vars 10812 1726773085.52254: variable 'omit' from source: magic vars 10812 1726773085.52286: variable 'item' from source: unknown 10812 1726773085.52329: variable 'item' from source: unknown 10812 1726773085.52345: variable 'omit' from source: magic vars 10812 1726773085.52364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10812 1726773085.52387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10812 1726773085.52403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10812 1726773085.52416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10812 1726773085.52425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10812 1726773085.52448: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10812 1726773085.52453: variable 'ansible_host' from source: host vars for 'managed_node2' 10812 1726773085.52456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10812 1726773085.52519: Set connection var ansible_pipelining to False 10812 1726773085.52524: Set connection var ansible_timeout to 10 10812 1726773085.52529: Set connection var ansible_module_compression to ZIP_DEFLATED 10812 1726773085.52531: Set connection var ansible_shell_type to sh 10812 1726773085.52534: Set connection var ansible_shell_executable to /bin/sh 10812 1726773085.52537: Set connection var ansible_connection to ssh 10812 1726773085.52548: variable 'ansible_shell_executable' from source: unknown 10812 1726773085.52550: variable 'ansible_connection' 
from source: unknown 10812 1726773085.52552: variable 'ansible_module_compression' from source: unknown 10812 1726773085.52553: variable 'ansible_shell_type' from source: unknown 10812 1726773085.52555: variable 'ansible_shell_executable' from source: unknown 10812 1726773085.52557: variable 'ansible_host' from source: host vars for 'managed_node2' 10812 1726773085.52561: variable 'ansible_pipelining' from source: unknown 10812 1726773085.52563: variable 'ansible_timeout' from source: unknown 10812 1726773085.52565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10812 1726773085.52648: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10812 1726773085.52657: variable 'omit' from source: magic vars 10812 1726773085.52661: starting attempt loop 10812 1726773085.52663: running the handler 10812 1726773085.52673: _low_level_execute_command(): starting 10812 1726773085.52679: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10812 1726773085.55039: stdout chunk (state=2): >>>/root <<< 10812 1726773085.55164: stderr chunk (state=3): >>><<< 10812 1726773085.55171: stdout chunk (state=3): >>><<< 10812 1726773085.55191: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10812 1726773085.55203: _low_level_execute_command(): starting 10812 1726773085.55209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993 `" && echo ansible-tmp-1726773085.5519836-10812-210301571554993="` echo /root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993 `" ) && sleep 0' 10812 1726773085.57747: stdout chunk (state=2): >>>ansible-tmp-1726773085.5519836-10812-210301571554993=/root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993 <<< 10812 1726773085.57941: stderr chunk (state=3): >>><<< 10812 1726773085.57948: stdout chunk (state=3): >>><<< 10812 1726773085.57967: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773085.5519836-10812-210301571554993=/root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993 , stderr= 10812 1726773085.58010: variable 'ansible_module_compression' from source: unknown 10812 1726773085.58053: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10812 1726773085.58080: variable 'ansible_facts' from source: unknown 10812 1726773085.58149: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993/AnsiballZ_stat.py 10812 1726773085.58249: Sending initial data 10812 1726773085.58256: Sent initial data (152 bytes) 10812 1726773085.60743: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpzpn86hja /root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993/AnsiballZ_stat.py <<< 10812 1726773085.62198: stderr chunk (state=3): >>><<< 10812 1726773085.62211: stdout chunk (state=3): >>><<< 10812 1726773085.62235: done transferring module to remote 10812 1726773085.62252: _low_level_execute_command(): starting 10812 1726773085.62258: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993/ 
/root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993/AnsiballZ_stat.py && sleep 0' 10812 1726773085.64838: stderr chunk (state=2): >>><<< 10812 1726773085.64847: stdout chunk (state=2): >>><<< 10812 1726773085.64862: _low_level_execute_command() done: rc=0, stdout=, stderr= 10812 1726773085.64866: _low_level_execute_command(): starting 10812 1726773085.64872: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993/AnsiballZ_stat.py && sleep 0' 10812 1726773085.80012: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10812 1726773085.81192: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10812 1726773085.81213: stderr chunk (state=3): >>><<< 10812 1726773085.81219: stdout chunk (state=3): >>><<< 10812 1726773085.81234: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 10812 1726773085.81256: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10812 1726773085.81269: _low_level_execute_command(): starting 10812 1726773085.81275: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773085.5519836-10812-210301571554993/ > /dev/null 2>&1 && sleep 0' 10812 1726773085.83689: stderr chunk (state=2): >>><<< 10812 1726773085.83697: stdout chunk (state=2): >>><<< 10812 1726773085.83712: _low_level_execute_command() done: rc=0, stdout=, stderr= 10812 1726773085.83719: handler run complete 10812 1726773085.83736: attempt loop complete, returning result 10812 1726773085.83752: variable 'item' from source: unknown 10812 1726773085.83813: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 10812 1726773085.83882: variable 'ansible_host' from source: host vars for 'managed_node2' 10812 1726773085.83890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10812 1726773085.83896: variable 'omit' from source: magic vars 10812 1726773085.84009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10812 1726773085.84034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10812 1726773085.84057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10812 1726773085.84103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10812 1726773085.84116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10812 1726773085.84171: variable 'item' from source: unknown 10812 1726773085.84180: Evaluated conditional (item | length > 0): True 10812 1726773085.84187: variable 'omit' from source: magic vars 10812 1726773085.84202: variable 'omit' from source: magic vars 10812 1726773085.84230: variable 'item' from source: unknown 10812 1726773085.84271: variable 'item' from source: unknown 10812 1726773085.84282: variable 'omit' from source: magic vars 10812 1726773085.84305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10812 1726773085.84314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10812 1726773085.84321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10812 1726773085.84334: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10812 1726773085.84338: variable 'ansible_host' from source: host vars for 'managed_node2' 10812 1726773085.84342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10812 1726773085.84393: Set connection var ansible_pipelining to False 10812 1726773085.84402: Set connection var ansible_timeout to 10 10812 1726773085.84410: Set connection var ansible_module_compression to ZIP_DEFLATED 10812 1726773085.84413: Set connection var ansible_shell_type to sh 10812 1726773085.84419: Set connection var ansible_shell_executable to /bin/sh 10812 1726773085.84424: Set connection var ansible_connection to ssh 10812 1726773085.84439: variable 'ansible_shell_executable' from source: unknown 10812 1726773085.84442: variable 'ansible_connection' from source: unknown 10812 1726773085.84445: variable 'ansible_module_compression' from source: unknown 10812 1726773085.84449: variable 'ansible_shell_type' from source: unknown 10812 1726773085.84452: variable 'ansible_shell_executable' from source: unknown 10812 1726773085.84455: variable 'ansible_host' from source: host vars for 'managed_node2' 10812 1726773085.84460: variable 'ansible_pipelining' from source: unknown 10812 1726773085.84463: variable 'ansible_timeout' from source: unknown 10812 1726773085.84467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10812 1726773085.84537: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10812 1726773085.84547: variable 
'omit' from source: magic vars 10812 1726773085.84552: starting attempt loop 10812 1726773085.84555: running the handler 10812 1726773085.84562: _low_level_execute_command(): starting 10812 1726773085.84566: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10812 1726773085.86751: stdout chunk (state=2): >>>/root <<< 10812 1726773085.86867: stderr chunk (state=3): >>><<< 10812 1726773085.86873: stdout chunk (state=3): >>><<< 10812 1726773085.86887: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10812 1726773085.86897: _low_level_execute_command(): starting 10812 1726773085.86903: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595 `" && echo ansible-tmp-1726773085.8689334-10812-218687907578595="` echo /root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595 `" ) && sleep 0' 10812 1726773085.89456: stdout chunk (state=2): >>>ansible-tmp-1726773085.8689334-10812-218687907578595=/root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595 <<< 10812 1726773085.89587: stderr chunk (state=3): >>><<< 10812 1726773085.89594: stdout chunk (state=3): >>><<< 10812 1726773085.89608: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773085.8689334-10812-218687907578595=/root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595 , stderr= 10812 1726773085.89637: variable 'ansible_module_compression' from source: unknown 10812 1726773085.89673: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10812 1726773085.89693: variable 'ansible_facts' from source: unknown 10812 1726773085.89749: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595/AnsiballZ_stat.py 10812 1726773085.89837: Sending initial data 10812 1726773085.89844: Sent initial data (152 bytes) 10812 1726773085.92404: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp3lo586a9 /root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595/AnsiballZ_stat.py <<< 10812 1726773085.94064: stderr chunk (state=3): >>><<< 10812 1726773085.94073: stdout chunk (state=3): >>><<< 10812 1726773085.94097: done transferring module to remote 10812 1726773085.94107: _low_level_execute_command(): starting 10812 1726773085.94115: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595/ /root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595/AnsiballZ_stat.py && sleep 0' 10812 1726773085.97234: stderr chunk (state=2): >>><<< 10812 1726773085.97245: stdout chunk (state=2): >>><<< 10812 1726773085.97262: _low_level_execute_command() done: rc=0, stdout=, stderr= 10812 1726773085.97267: _low_level_execute_command(): starting 10812 1726773085.97273: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595/AnsiballZ_stat.py && sleep 0' 10812 1726773086.13302: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773042.2211215, 
"mtime": 1726773040.2991023, "ctime": 1726773040.2991023, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10812 1726773086.14418: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10812 1726773086.14462: stderr chunk (state=3): >>><<< 10812 1726773086.14469: stdout chunk (state=3): >>><<< 10812 1726773086.14483: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773042.2211215, "mtime": 1726773040.2991023, "ctime": 1726773040.2991023, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 
10812 1726773086.14522: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10812 1726773086.14531: _low_level_execute_command(): starting 10812 1726773086.14537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773085.8689334-10812-218687907578595/ > /dev/null 2>&1 && sleep 0' 10812 1726773086.16930: stderr chunk (state=2): >>><<< 10812 1726773086.16940: stdout chunk (state=2): >>><<< 10812 1726773086.16954: _low_level_execute_command() done: rc=0, stdout=, stderr= 10812 1726773086.16960: handler run complete 10812 1726773086.16994: attempt loop complete, returning result 10812 1726773086.17014: variable 'item' from source: unknown 10812 1726773086.17072: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773042.2211215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773040.2991023, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726773040.2991023, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10812 1726773086.17120: dumping result to json 10812 1726773086.17130: done dumping result, returning 10812 1726773086.17139: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-885f-bbcf-000000000616] 10812 1726773086.17144: sending task result for task 0affffe7-6841-885f-bbcf-000000000616 10812 1726773086.17184: done sending task result for task 0affffe7-6841-885f-bbcf-000000000616 10812 1726773086.17189: WORKER PROCESS EXITING 8240 1726773086.17372: no more pending results, returning what we have 8240 1726773086.17376: results queue empty 8240 1726773086.17377: checking for any_errors_fatal 8240 1726773086.17383: done checking for any_errors_fatal 8240 1726773086.17384: checking for max_fail_percentage 8240 1726773086.17387: done checking for max_fail_percentage 8240 1726773086.17388: checking to see if all hosts have failed and the running result is not ok 8240 1726773086.17389: done checking to see if all hosts have failed 8240 1726773086.17389: getting the remaining hosts for this loop 8240 1726773086.17390: done getting the remaining hosts for this loop 8240 1726773086.17394: getting the next task for host managed_node2 8240 1726773086.17400: done getting next task for host managed_node2 8240 
1726773086.17403: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8240 1726773086.17405: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773086.17415: getting variables 8240 1726773086.17416: in VariableManager get_vars() 8240 1726773086.17449: Calling all_inventory to load vars for managed_node2 8240 1726773086.17452: Calling groups_inventory to load vars for managed_node2 8240 1726773086.17453: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773086.17462: Calling all_plugins_play to load vars for managed_node2 8240 1726773086.17465: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773086.17467: Calling groups_plugins_play to load vars for managed_node2 8240 1726773086.17576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773086.17721: done with get_vars() 8240 1726773086.17729: done getting variables 8240 1726773086.17771: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.693) 0:01:04.821 **** 8240 1726773086.17795: entering _queue_task() for managed_node2/set_fact 8240 1726773086.17958: worker is 1 (out of 1 available) 8240 1726773086.17973: exiting _queue_task() for managed_node2/set_fact 8240 1726773086.17989: done queuing things up, now waiting for results queue to drain 8240 1726773086.17990: waiting for pending results... 
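Given those probe results, the set_fact task queued above only needs to pick the first candidate directory that actually exists, and the ok: output further down confirms it resolves to /etc/tuned. A hedged sketch of one way to express that selection; only the fact name and the resulting value are taken from the trace, the filter chain itself is an assumption:

    - name: Set tuned profile parent dir
      set_fact:
        __kernel_settings_profile_parent: "{{ __kernel_settings_find_profile_dirs.results
          | selectattr('stat', 'defined')
          | selectattr('stat.exists')
          | map(attribute='item')
          | first }}"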
10847 1726773086.18120: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 10847 1726773086.18238: in run() - task 0affffe7-6841-885f-bbcf-000000000617 10847 1726773086.18255: variable 'ansible_search_path' from source: unknown 10847 1726773086.18259: variable 'ansible_search_path' from source: unknown 10847 1726773086.18288: calling self._execute() 10847 1726773086.18360: variable 'ansible_host' from source: host vars for 'managed_node2' 10847 1726773086.18369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10847 1726773086.18378: variable 'omit' from source: magic vars 10847 1726773086.18456: variable 'omit' from source: magic vars 10847 1726773086.18489: variable 'omit' from source: magic vars 10847 1726773086.18817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10847 1726773086.20322: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10847 1726773086.20377: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10847 1726773086.20411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10847 1726773086.20438: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10847 1726773086.20458: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10847 1726773086.20520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10847 1726773086.20542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10847 1726773086.20561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10847 1726773086.20590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10847 1726773086.20604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10847 1726773086.20638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10847 1726773086.20656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10847 1726773086.20672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10847 1726773086.20699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10847 1726773086.20714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10847 1726773086.20755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10847 1726773086.20773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10847 1726773086.20793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10847 1726773086.20821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10847 1726773086.20834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10847 1726773086.20988: variable '__kernel_settings_find_profile_dirs' from source: set_fact 10847 1726773086.21056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10847 1726773086.21167: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10847 1726773086.21196: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10847 1726773086.21221: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10847 1726773086.21242: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10847 1726773086.21275: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10847 1726773086.21293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10847 1726773086.21313: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10847 1726773086.21332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10847 1726773086.21370: variable 'omit' from source: magic vars 10847 1726773086.21397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10847 1726773086.21419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10847 1726773086.21435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10847 1726773086.21448: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10847 1726773086.21458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10847 1726773086.21482: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10847 1726773086.21489: variable 'ansible_host' from source: host vars for 'managed_node2' 10847 1726773086.21494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10847 1726773086.21560: Set connection var ansible_pipelining to False 10847 1726773086.21568: Set connection var ansible_timeout to 10 10847 1726773086.21576: Set connection var ansible_module_compression to ZIP_DEFLATED 10847 1726773086.21579: Set connection var ansible_shell_type to sh 10847 1726773086.21586: Set connection var ansible_shell_executable to /bin/sh 10847 1726773086.21592: Set connection var ansible_connection to ssh 10847 1726773086.21611: variable 'ansible_shell_executable' from source: unknown 10847 1726773086.21616: variable 'ansible_connection' from source: unknown 10847 1726773086.21619: variable 'ansible_module_compression' from source: unknown 10847 1726773086.21623: variable 'ansible_shell_type' from source: unknown 10847 1726773086.21626: variable 'ansible_shell_executable' from source: unknown 10847 1726773086.21630: variable 'ansible_host' from source: host vars for 'managed_node2' 10847 1726773086.21634: variable 'ansible_pipelining' from source: unknown 10847 1726773086.21637: variable 'ansible_timeout' from source: unknown 10847 1726773086.21641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10847 1726773086.21707: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10847 1726773086.21719: variable 'omit' from source: magic vars 10847 1726773086.21725: starting attempt loop 10847 1726773086.21728: running the handler 10847 1726773086.21738: handler run complete 10847 1726773086.21746: attempt loop complete, returning result 10847 1726773086.21749: _execute() done 10847 1726773086.21752: dumping result to json 10847 1726773086.21755: done dumping result, returning 10847 1726773086.21762: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-885f-bbcf-000000000617] 10847 1726773086.21768: sending task result for task 0affffe7-6841-885f-bbcf-000000000617 10847 1726773086.21788: done sending task result for task 0affffe7-6841-885f-bbcf-000000000617 10847 1726773086.21791: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8240 1726773086.21923: no more pending results, returning what we have 8240 1726773086.21926: results queue empty 8240 1726773086.21927: checking for any_errors_fatal 8240 1726773086.21938: done checking for any_errors_fatal 8240 1726773086.21938: checking for max_fail_percentage 8240 1726773086.21940: done checking for max_fail_percentage 8240 1726773086.21940: checking to see if all hosts have failed and the running result is not ok 8240 1726773086.21941: done checking to see if all hosts have failed 8240 1726773086.21942: getting the 
remaining hosts for this loop 8240 1726773086.21943: done getting the remaining hosts for this loop 8240 1726773086.21946: getting the next task for host managed_node2 8240 1726773086.21953: done getting next task for host managed_node2 8240 1726773086.21955: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8240 1726773086.21958: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773086.21967: getting variables 8240 1726773086.21968: in VariableManager get_vars() 8240 1726773086.22006: Calling all_inventory to load vars for managed_node2 8240 1726773086.22009: Calling groups_inventory to load vars for managed_node2 8240 1726773086.22011: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773086.22020: Calling all_plugins_play to load vars for managed_node2 8240 1726773086.22023: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773086.22025: Calling groups_plugins_play to load vars for managed_node2 8240 1726773086.22174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773086.22342: done with get_vars() 8240 1726773086.22350: done getting variables 8240 1726773086.22399: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.046) 0:01:04.868 **** 8240 1726773086.22432: entering _queue_task() for managed_node2/service 8240 1726773086.22630: worker is 1 (out of 1 available) 8240 1726773086.22649: exiting _queue_task() for managed_node2/service 8240 1726773086.22665: done queuing things up, now waiting for results queue to drain 8240 1726773086.22667: waiting for pending results... 
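The task queued above loops over __kernel_settings_services, a list loaded earlier through include_vars, and the systemd status dump that follows shows the tuned service being inspected on the target. A sketch of an equivalent loop; the exact state and enablement flags are assumptions, though the task name states the intent of enabled and started:

    - name: Ensure required services are enabled and started
      service:
        name: "{{ item }}"
        state: started
        enabled: true
      loop: "{{ __kernel_settings_services }}"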
10848 1726773086.22797: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 10848 1726773086.22918: in run() - task 0affffe7-6841-885f-bbcf-000000000618 10848 1726773086.22934: variable 'ansible_search_path' from source: unknown 10848 1726773086.22938: variable 'ansible_search_path' from source: unknown 10848 1726773086.22972: variable '__kernel_settings_services' from source: include_vars 10848 1726773086.23211: variable '__kernel_settings_services' from source: include_vars 10848 1726773086.23270: variable 'omit' from source: magic vars 10848 1726773086.23364: variable 'ansible_host' from source: host vars for 'managed_node2' 10848 1726773086.23376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10848 1726773086.23388: variable 'omit' from source: magic vars 10848 1726773086.23464: variable 'omit' from source: magic vars 10848 1726773086.23506: variable 'omit' from source: magic vars 10848 1726773086.23539: variable 'item' from source: unknown 10848 1726773086.23618: variable 'item' from source: unknown 10848 1726773086.23639: variable 'omit' from source: magic vars 10848 1726773086.23677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10848 1726773086.23719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10848 1726773086.23738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10848 1726773086.23826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10848 1726773086.23840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10848 1726773086.23866: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10848 1726773086.23872: variable 'ansible_host' from source: host vars for 'managed_node2' 10848 1726773086.23875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10848 1726773086.23970: Set connection var ansible_pipelining to False 10848 1726773086.23982: Set connection var ansible_timeout to 10 10848 1726773086.23994: Set connection var ansible_module_compression to ZIP_DEFLATED 10848 1726773086.23998: Set connection var ansible_shell_type to sh 10848 1726773086.24005: Set connection var ansible_shell_executable to /bin/sh 10848 1726773086.24008: Set connection var ansible_connection to ssh 10848 1726773086.24024: variable 'ansible_shell_executable' from source: unknown 10848 1726773086.24028: variable 'ansible_connection' from source: unknown 10848 1726773086.24030: variable 'ansible_module_compression' from source: unknown 10848 1726773086.24032: variable 'ansible_shell_type' from source: unknown 10848 1726773086.24035: variable 'ansible_shell_executable' from source: unknown 10848 1726773086.24038: variable 'ansible_host' from source: host vars for 'managed_node2' 10848 1726773086.24040: variable 'ansible_pipelining' from source: unknown 10848 1726773086.24041: variable 'ansible_timeout' from source: unknown 10848 1726773086.24044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10848 1726773086.24154: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10848 1726773086.24165: variable 'omit' from source: magic vars 10848 1726773086.24170: starting attempt loop 10848 1726773086.24175: running the handler 10848 1726773086.24250: variable 'ansible_facts' from source: unknown 10848 1726773086.24349: _low_level_execute_command(): starting 10848 1726773086.24359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10848 1726773086.26806: stdout chunk (state=2): >>>/root <<< 10848 1726773086.26941: stderr chunk (state=3): >>><<< 10848 1726773086.26949: stdout chunk (state=3): >>><<< 10848 1726773086.26967: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10848 1726773086.26982: _low_level_execute_command(): starting 10848 1726773086.26993: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318 `" && echo ansible-tmp-1726773086.269778-10848-208239036514318="` echo /root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318 `" ) && sleep 0' 10848 1726773086.29991: stdout chunk (state=2): >>>ansible-tmp-1726773086.269778-10848-208239036514318=/root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318 <<< 10848 1726773086.30150: stderr chunk (state=3): >>><<< 10848 1726773086.30158: stdout chunk (state=3): >>><<< 10848 1726773086.30175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773086.269778-10848-208239036514318=/root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318 , stderr= 10848 1726773086.30208: variable 'ansible_module_compression' from source: unknown 10848 1726773086.30262: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10848 1726773086.30319: variable 'ansible_facts' from source: unknown 10848 1726773086.30549: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318/AnsiballZ_systemd.py 10848 1726773086.30903: Sending initial data 10848 1726773086.30909: Sent initial data (154 bytes) 10848 1726773086.33395: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmplvlnv9n0 /root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318/AnsiballZ_systemd.py <<< 10848 1726773086.35648: stderr chunk (state=3): >>><<< 10848 1726773086.35658: stdout chunk (state=3): >>><<< 10848 1726773086.35682: done transferring module to remote 10848 1726773086.35700: _low_level_execute_command(): starting 10848 1726773086.35707: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318/ /root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318/AnsiballZ_systemd.py && sleep 0' 10848 1726773086.38243: stderr chunk (state=2): >>><<< 10848 1726773086.38253: stdout chunk (state=2): >>><<< 10848 1726773086.38268: _low_level_execute_command() done: rc=0, stdout=, stderr= 10848 1726773086.38273: _low_level_execute_command(): starting 10848 1726773086.38280: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318/AnsiballZ_systemd.py && sleep 0' 10848 1726773086.66247: stdout chunk (state=2): >>> <<< 10848 
1726773086.66296: stdout chunk (state=3): >>>{"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "20922368", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_<<< 10848 1726773086.66337: stdout chunk (state=3): >>>sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", 
"AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10848 1726773086.67913: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10848 1726773086.67956: stderr chunk (state=3): >>><<< 10848 1726773086.67962: stdout chunk (state=3): >>><<< 10848 1726773086.67982: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "20922368", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", 
"LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", 
"InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.9.64 closed. 10848 1726773086.68089: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10848 1726773086.68112: _low_level_execute_command(): starting 10848 1726773086.68118: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773086.269778-10848-208239036514318/ > /dev/null 2>&1 && sleep 0' 10848 1726773086.70541: stderr chunk (state=2): >>><<< 10848 1726773086.70549: stdout chunk (state=2): >>><<< 10848 1726773086.70566: _low_level_execute_command() done: rc=0, stdout=, stderr= 10848 1726773086.70574: handler run complete 10848 1726773086.70609: attempt loop complete, returning result 10848 1726773086.70628: variable 'item' from source: unknown 10848 1726773086.70690: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Before": "shutdown.target 
multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "671", "MemoryAccounting": "yes", "MemoryCurrent": "20922368", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "WatchdogUSec": "0" } } 10848 1726773086.70788: dumping result to json 10848 1726773086.70806: done dumping result, returning 10848 1726773086.70815: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-885f-bbcf-000000000618] 10848 1726773086.70821: sending task 
result for task 0affffe7-6841-885f-bbcf-000000000618 10848 1726773086.70929: done sending task result for task 0affffe7-6841-885f-bbcf-000000000618 10848 1726773086.70934: WORKER PROCESS EXITING 8240 1726773086.71297: no more pending results, returning what we have 8240 1726773086.71302: results queue empty 8240 1726773086.71303: checking for any_errors_fatal 8240 1726773086.71307: done checking for any_errors_fatal 8240 1726773086.71307: checking for max_fail_percentage 8240 1726773086.71308: done checking for max_fail_percentage 8240 1726773086.71308: checking to see if all hosts have failed and the running result is not ok 8240 1726773086.71309: done checking to see if all hosts have failed 8240 1726773086.71309: getting the remaining hosts for this loop 8240 1726773086.71310: done getting the remaining hosts for this loop 8240 1726773086.71313: getting the next task for host managed_node2 8240 1726773086.71317: done getting next task for host managed_node2 8240 1726773086.71319: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8240 1726773086.71321: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773086.71328: getting variables 8240 1726773086.71330: in VariableManager get_vars() 8240 1726773086.71350: Calling all_inventory to load vars for managed_node2 8240 1726773086.71352: Calling groups_inventory to load vars for managed_node2 8240 1726773086.71353: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773086.71361: Calling all_plugins_play to load vars for managed_node2 8240 1726773086.71363: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773086.71364: Calling groups_plugins_play to load vars for managed_node2 8240 1726773086.71474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773086.71592: done with get_vars() 8240 1726773086.71602: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:26 -0400 (0:00:00.492) 0:01:05.360 **** 8240 1726773086.71669: entering _queue_task() for managed_node2/file 8240 1726773086.71836: worker is 1 (out of 1 available) 8240 1726773086.71850: exiting _queue_task() for managed_node2/file 8240 1726773086.71862: done queuing things up, now waiting for results queue to drain 8240 1726773086.71864: waiting for pending results... 
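The next task, at tasks/main.yml:74, runs the file module. Based on the arguments echoed in its result further below (path /etc/tuned/kernel_settings, state directory, mode 0755) and the variables resolved for it (__kernel_settings_profile_dir, assembled from __kernel_settings_profile_parent and __kernel_settings_tuned_profile), a rough sketch of the task might be:

    - name: Ensure kernel settings profile directory exists
      file:
        path: "{{ __kernel_settings_profile_dir }}"
        state: directory
        mode: "0755"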
10867 1726773086.71992: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 10867 1726773086.72111: in run() - task 0affffe7-6841-885f-bbcf-000000000619 10867 1726773086.72128: variable 'ansible_search_path' from source: unknown 10867 1726773086.72132: variable 'ansible_search_path' from source: unknown 10867 1726773086.72159: calling self._execute() 10867 1726773086.72229: variable 'ansible_host' from source: host vars for 'managed_node2' 10867 1726773086.72238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10867 1726773086.72248: variable 'omit' from source: magic vars 10867 1726773086.72335: variable 'omit' from source: magic vars 10867 1726773086.72380: variable 'omit' from source: magic vars 10867 1726773086.72406: variable '__kernel_settings_profile_dir' from source: role '' all vars 10867 1726773086.72623: variable '__kernel_settings_profile_dir' from source: role '' all vars 10867 1726773086.72695: variable '__kernel_settings_profile_parent' from source: set_fact 10867 1726773086.72704: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10867 1726773086.72746: variable 'omit' from source: magic vars 10867 1726773086.72783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10867 1726773086.72817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10867 1726773086.72840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10867 1726773086.72857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10867 1726773086.72869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10867 1726773086.72921: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10867 1726773086.72929: variable 'ansible_host' from source: host vars for 'managed_node2' 10867 1726773086.72933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10867 1726773086.73032: Set connection var ansible_pipelining to False 10867 1726773086.73040: Set connection var ansible_timeout to 10 10867 1726773086.73048: Set connection var ansible_module_compression to ZIP_DEFLATED 10867 1726773086.73051: Set connection var ansible_shell_type to sh 10867 1726773086.73056: Set connection var ansible_shell_executable to /bin/sh 10867 1726773086.73061: Set connection var ansible_connection to ssh 10867 1726773086.73080: variable 'ansible_shell_executable' from source: unknown 10867 1726773086.73084: variable 'ansible_connection' from source: unknown 10867 1726773086.73091: variable 'ansible_module_compression' from source: unknown 10867 1726773086.73094: variable 'ansible_shell_type' from source: unknown 10867 1726773086.73096: variable 'ansible_shell_executable' from source: unknown 10867 1726773086.73099: variable 'ansible_host' from source: host vars for 'managed_node2' 10867 1726773086.73102: variable 'ansible_pipelining' from source: unknown 10867 1726773086.73105: variable 'ansible_timeout' from source: unknown 10867 1726773086.73109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10867 1726773086.73282: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10867 1726773086.73297: variable 'omit' from source: magic vars 10867 1726773086.73304: starting attempt loop 10867 1726773086.73308: running the handler 10867 1726773086.73321: _low_level_execute_command(): starting 10867 1726773086.73329: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10867 1726773086.75679: stdout chunk (state=2): >>>/root <<< 10867 1726773086.75797: stderr chunk (state=3): >>><<< 10867 1726773086.75806: stdout chunk (state=3): >>><<< 10867 1726773086.75824: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10867 1726773086.75838: _low_level_execute_command(): starting 10867 1726773086.75845: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538 `" && echo ansible-tmp-1726773086.758332-10867-12586027046538="` echo /root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538 `" ) && sleep 0' 10867 1726773086.78406: stdout chunk (state=2): >>>ansible-tmp-1726773086.758332-10867-12586027046538=/root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538 <<< 10867 1726773086.78534: stderr chunk (state=3): >>><<< 10867 1726773086.78540: stdout chunk (state=3): >>><<< 10867 1726773086.78555: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773086.758332-10867-12586027046538=/root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538 , stderr= 10867 1726773086.78595: variable 'ansible_module_compression' from source: unknown 10867 1726773086.78641: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10867 1726773086.78670: variable 'ansible_facts' from source: unknown 10867 1726773086.78741: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538/AnsiballZ_file.py 10867 1726773086.78844: Sending initial data 10867 1726773086.78852: Sent initial data (150 bytes) 10867 1726773086.81401: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmprbfj7abo /root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538/AnsiballZ_file.py <<< 10867 1726773086.82544: stderr chunk (state=3): >>><<< 10867 1726773086.82553: stdout chunk (state=3): >>><<< 10867 1726773086.82573: done transferring module to remote 10867 1726773086.82584: _low_level_execute_command(): starting 10867 1726773086.82591: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538/ /root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538/AnsiballZ_file.py && sleep 0' 10867 1726773086.84957: stderr chunk (state=2): >>><<< 10867 1726773086.84965: stdout chunk (state=2): >>><<< 10867 1726773086.84979: _low_level_execute_command() done: rc=0, stdout=, stderr= 10867 1726773086.84984: _low_level_execute_command(): starting 10867 1726773086.84990: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538/AnsiballZ_file.py && sleep 0' 10867 1726773087.01105: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": 
"/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10867 1726773087.02229: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10867 1726773087.02279: stderr chunk (state=3): >>><<< 10867 1726773087.02287: stdout chunk (state=3): >>><<< 10867 1726773087.02305: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
10867 1726773087.02337: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10867 1726773087.02348: _low_level_execute_command(): starting 10867 1726773087.02354: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773086.758332-10867-12586027046538/ > /dev/null 2>&1 && sleep 0' 10867 1726773087.04789: stderr chunk (state=2): >>><<< 10867 1726773087.04796: stdout chunk (state=2): >>><<< 10867 1726773087.04811: _low_level_execute_command() done: rc=0, stdout=, stderr= 10867 1726773087.04819: handler run complete 10867 1726773087.04838: attempt loop complete, returning result 10867 1726773087.04844: _execute() done 10867 1726773087.04847: dumping result to json 10867 1726773087.04852: done dumping result, returning 10867 1726773087.04859: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-885f-bbcf-000000000619] 10867 1726773087.04865: sending task result for task 0affffe7-6841-885f-bbcf-000000000619 10867 1726773087.04900: done sending task result for task 0affffe7-6841-885f-bbcf-000000000619 10867 1726773087.04903: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8240 1726773087.05062: no more pending results, returning what we have 8240 1726773087.05065: results queue empty 8240 1726773087.05066: checking for any_errors_fatal 8240 1726773087.05081: done checking for any_errors_fatal 8240 1726773087.05082: checking for max_fail_percentage 8240 1726773087.05083: done checking for max_fail_percentage 8240 1726773087.05084: checking to see if all hosts have failed and the running result is not ok 8240 1726773087.05087: done checking to see if all hosts have failed 8240 1726773087.05087: getting the remaining hosts for this loop 8240 1726773087.05089: done getting the remaining hosts for this loop 8240 1726773087.05092: getting the next task for host managed_node2 8240 1726773087.05098: done getting next task for host managed_node2 8240 1726773087.05103: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8240 1726773087.05106: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8240 1726773087.05115: getting variables 8240 1726773087.05117: in VariableManager get_vars() 8240 1726773087.05152: Calling all_inventory to load vars for managed_node2 8240 1726773087.05154: Calling groups_inventory to load vars for managed_node2 8240 1726773087.05156: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773087.05165: Calling all_plugins_play to load vars for managed_node2 8240 1726773087.05167: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773087.05169: Calling groups_plugins_play to load vars for managed_node2 8240 1726773087.05278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773087.05401: done with get_vars() 8240 1726773087.05410: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:27 -0400 (0:00:00.338) 0:01:05.698 **** 8240 1726773087.05477: entering _queue_task() for managed_node2/slurp 8240 1726773087.05649: worker is 1 (out of 1 available) 8240 1726773087.05665: exiting _queue_task() for managed_node2/slurp 8240 1726773087.05678: done queuing things up, now waiting for results queue to drain 8240 1726773087.05680: waiting for pending results... 10880 1726773087.05811: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 10880 1726773087.05928: in run() - task 0affffe7-6841-885f-bbcf-00000000061a 10880 1726773087.05947: variable 'ansible_search_path' from source: unknown 10880 1726773087.05951: variable 'ansible_search_path' from source: unknown 10880 1726773087.05979: calling self._execute() 10880 1726773087.06049: variable 'ansible_host' from source: host vars for 'managed_node2' 10880 1726773087.06058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10880 1726773087.06067: variable 'omit' from source: magic vars 10880 1726773087.06144: variable 'omit' from source: magic vars 10880 1726773087.06177: variable 'omit' from source: magic vars 10880 1726773087.06200: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10880 1726773087.06422: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10880 1726773087.06482: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10880 1726773087.06513: variable 'omit' from source: magic vars 10880 1726773087.06546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10880 1726773087.06573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10880 1726773087.06594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10880 1726773087.06610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10880 1726773087.06622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10880 1726773087.06645: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10880 1726773087.06650: variable 'ansible_host' from source: host vars for 'managed_node2' 10880 1726773087.06654: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node2' 10880 1726773087.06724: Set connection var ansible_pipelining to False 10880 1726773087.06731: Set connection var ansible_timeout to 10 10880 1726773087.06739: Set connection var ansible_module_compression to ZIP_DEFLATED 10880 1726773087.06742: Set connection var ansible_shell_type to sh 10880 1726773087.06748: Set connection var ansible_shell_executable to /bin/sh 10880 1726773087.06753: Set connection var ansible_connection to ssh 10880 1726773087.06769: variable 'ansible_shell_executable' from source: unknown 10880 1726773087.06774: variable 'ansible_connection' from source: unknown 10880 1726773087.06778: variable 'ansible_module_compression' from source: unknown 10880 1726773087.06781: variable 'ansible_shell_type' from source: unknown 10880 1726773087.06784: variable 'ansible_shell_executable' from source: unknown 10880 1726773087.06789: variable 'ansible_host' from source: host vars for 'managed_node2' 10880 1726773087.06793: variable 'ansible_pipelining' from source: unknown 10880 1726773087.06796: variable 'ansible_timeout' from source: unknown 10880 1726773087.06800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10880 1726773087.06940: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10880 1726773087.06952: variable 'omit' from source: magic vars 10880 1726773087.06958: starting attempt loop 10880 1726773087.06961: running the handler 10880 1726773087.06972: _low_level_execute_command(): starting 10880 1726773087.06979: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10880 1726773087.09301: stdout chunk (state=2): >>>/root <<< 10880 1726773087.09421: stderr chunk (state=3): >>><<< 10880 1726773087.09428: stdout chunk (state=3): >>><<< 10880 1726773087.09446: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10880 1726773087.09460: _low_level_execute_command(): starting 10880 1726773087.09466: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700 `" && echo ansible-tmp-1726773087.0945477-10880-68977977278700="` echo /root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700 `" ) && sleep 0' 10880 1726773087.12071: stdout chunk (state=2): >>>ansible-tmp-1726773087.0945477-10880-68977977278700=/root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700 <<< 10880 1726773087.12207: stderr chunk (state=3): >>><<< 10880 1726773087.12214: stdout chunk (state=3): >>><<< 10880 1726773087.12228: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773087.0945477-10880-68977977278700=/root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700 , stderr= 10880 1726773087.12264: variable 'ansible_module_compression' from source: unknown 10880 1726773087.12302: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 10880 1726773087.12332: variable 'ansible_facts' from source: unknown 10880 1726773087.12406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700/AnsiballZ_slurp.py 10880 1726773087.12588: Sending initial data 10880 1726773087.12596: Sent initial data (152 bytes) 10880 
1726773087.15069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp4eyf398k /root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700/AnsiballZ_slurp.py <<< 10880 1726773087.16204: stderr chunk (state=3): >>><<< 10880 1726773087.16211: stdout chunk (state=3): >>><<< 10880 1726773087.16230: done transferring module to remote 10880 1726773087.16241: _low_level_execute_command(): starting 10880 1726773087.16246: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700/ /root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700/AnsiballZ_slurp.py && sleep 0' 10880 1726773087.18750: stderr chunk (state=2): >>><<< 10880 1726773087.18760: stdout chunk (state=2): >>><<< 10880 1726773087.18778: _low_level_execute_command() done: rc=0, stdout=, stderr= 10880 1726773087.18785: _low_level_execute_command(): starting 10880 1726773087.18791: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700/AnsiballZ_slurp.py && sleep 0' 10880 1726773087.33865: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 10880 1726773087.34893: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10880 1726773087.34938: stderr chunk (state=3): >>><<< 10880 1726773087.34946: stdout chunk (state=3): >>><<< 10880 1726773087.34961: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.9.64 closed. 
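The slurp result above returns the contents of /etc/tuned/active_profile base64-encoded; "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK" decodes to "virtual-guest kernel_settings" (with a trailing newline), which matches the value the following Set active_profile task stores in __kernel_settings_active_profile. The task at tasks/main.yml:80 presumably reads the file roughly as sketched below; the path and register names are inferred from the variables referenced in the log (__kernel_settings_tuned_active_profile, __kernel_settings_tuned_current_profile) and may be spelled differently in the actual role:

    - name: Get active_profile
      slurp:
        path: "{{ __kernel_settings_tuned_active_profile }}"
      register: __kernel_settings_tuned_current_profile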
10880 1726773087.34988: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10880 1726773087.34999: _low_level_execute_command(): starting 10880 1726773087.35005: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773087.0945477-10880-68977977278700/ > /dev/null 2>&1 && sleep 0' 10880 1726773087.37412: stderr chunk (state=2): >>><<< 10880 1726773087.37420: stdout chunk (state=2): >>><<< 10880 1726773087.37433: _low_level_execute_command() done: rc=0, stdout=, stderr= 10880 1726773087.37440: handler run complete 10880 1726773087.37451: attempt loop complete, returning result 10880 1726773087.37454: _execute() done 10880 1726773087.37456: dumping result to json 10880 1726773087.37458: done dumping result, returning 10880 1726773087.37463: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-885f-bbcf-00000000061a] 10880 1726773087.37467: sending task result for task 0affffe7-6841-885f-bbcf-00000000061a 10880 1726773087.37498: done sending task result for task 0affffe7-6841-885f-bbcf-00000000061a 10880 1726773087.37501: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8240 1726773087.37745: no more pending results, returning what we have 8240 1726773087.37747: results queue empty 8240 1726773087.37748: checking for any_errors_fatal 8240 1726773087.37756: done checking for any_errors_fatal 8240 1726773087.37757: checking for max_fail_percentage 8240 1726773087.37758: done checking for max_fail_percentage 8240 1726773087.37758: checking to see if all hosts have failed and the running result is not ok 8240 1726773087.37759: done checking to see if all hosts have failed 8240 1726773087.37759: getting the remaining hosts for this loop 8240 1726773087.37760: done getting the remaining hosts for this loop 8240 1726773087.37763: getting the next task for host managed_node2 8240 1726773087.37768: done getting next task for host managed_node2 8240 1726773087.37771: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8240 1726773087.37772: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773087.37779: getting variables 8240 1726773087.37780: in VariableManager get_vars() 8240 1726773087.37814: Calling all_inventory to load vars for managed_node2 8240 1726773087.37816: Calling groups_inventory to load vars for managed_node2 8240 1726773087.37818: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773087.37826: Calling all_plugins_play to load vars for managed_node2 8240 1726773087.37828: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773087.37829: Calling groups_plugins_play to load vars for managed_node2 8240 1726773087.37938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773087.38061: done with get_vars() 8240 1726773087.38069: done getting variables 8240 1726773087.38117: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:27 -0400 (0:00:00.326) 0:01:06.025 **** 8240 1726773087.38142: entering _queue_task() for managed_node2/set_fact 8240 1726773087.38316: worker is 1 (out of 1 available) 8240 1726773087.38330: exiting _queue_task() for managed_node2/set_fact 8240 1726773087.38345: done queuing things up, now waiting for results queue to drain 8240 1726773087.38346: waiting for pending results... 10899 1726773087.38478: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 10899 1726773087.38598: in run() - task 0affffe7-6841-885f-bbcf-00000000061b 10899 1726773087.38616: variable 'ansible_search_path' from source: unknown 10899 1726773087.38620: variable 'ansible_search_path' from source: unknown 10899 1726773087.38647: calling self._execute() 10899 1726773087.38801: variable 'ansible_host' from source: host vars for 'managed_node2' 10899 1726773087.38810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10899 1726773087.38819: variable 'omit' from source: magic vars 10899 1726773087.38891: variable 'omit' from source: magic vars 10899 1726773087.38926: variable 'omit' from source: magic vars 10899 1726773087.39207: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10899 1726773087.39218: variable '__cur_profile' from source: task vars 10899 1726773087.39326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10899 1726773087.40808: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10899 1726773087.41046: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10899 1726773087.41077: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10899 1726773087.41105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10899 1726773087.41126: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10899 1726773087.41183: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10899 1726773087.41208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10899 1726773087.41227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10899 1726773087.41254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10899 1726773087.41265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10899 1726773087.41343: variable '__kernel_settings_tuned_current_profile' from source: set_fact 10899 1726773087.41387: variable 'omit' from source: magic vars 10899 1726773087.41412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10899 1726773087.41437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10899 1726773087.41453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10899 1726773087.41466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10899 1726773087.41476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10899 1726773087.41508: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10899 1726773087.41514: variable 'ansible_host' from source: host vars for 'managed_node2' 10899 1726773087.41519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10899 1726773087.41583: Set connection var ansible_pipelining to False 10899 1726773087.41592: Set connection var ansible_timeout to 10 10899 1726773087.41602: Set connection var ansible_module_compression to ZIP_DEFLATED 10899 1726773087.41606: Set connection var ansible_shell_type to sh 10899 1726773087.41611: Set connection var ansible_shell_executable to /bin/sh 10899 1726773087.41616: Set connection var ansible_connection to ssh 10899 1726773087.41633: variable 'ansible_shell_executable' from source: unknown 10899 1726773087.41637: variable 'ansible_connection' from source: unknown 10899 1726773087.41640: variable 'ansible_module_compression' from source: unknown 10899 1726773087.41643: variable 'ansible_shell_type' from source: unknown 10899 1726773087.41647: variable 'ansible_shell_executable' from source: unknown 10899 1726773087.41650: variable 'ansible_host' from source: host vars for 'managed_node2' 10899 1726773087.41655: variable 'ansible_pipelining' from source: unknown 10899 1726773087.41658: variable 'ansible_timeout' from source: unknown 10899 1726773087.41662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10899 1726773087.41728: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10899 1726773087.41739: variable 'omit' from source: magic vars 10899 1726773087.41744: starting attempt loop 10899 1726773087.41748: running the handler 10899 1726773087.41757: handler run complete 10899 1726773087.41765: attempt loop complete, returning result 10899 1726773087.41768: _execute() done 10899 1726773087.41771: dumping result to json 10899 1726773087.41775: done dumping result, returning 10899 1726773087.41781: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-885f-bbcf-00000000061b] 10899 1726773087.41788: sending task result for task 0affffe7-6841-885f-bbcf-00000000061b 10899 1726773087.41810: done sending task result for task 0affffe7-6841-885f-bbcf-00000000061b 10899 1726773087.41813: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8240 1726773087.42032: no more pending results, returning what we have 8240 1726773087.42035: results queue empty 8240 1726773087.42036: checking for any_errors_fatal 8240 1726773087.42040: done checking for any_errors_fatal 8240 1726773087.42040: checking for max_fail_percentage 8240 1726773087.42042: done checking for max_fail_percentage 8240 1726773087.42042: checking to see if all hosts have failed and the running result is not ok 8240 1726773087.42043: done checking to see if all hosts have failed 8240 1726773087.42044: getting the remaining hosts for this loop 8240 1726773087.42045: done getting the remaining hosts for this loop 8240 1726773087.42048: getting the next task for host managed_node2 8240 1726773087.42054: done getting next task for host managed_node2 8240 1726773087.42056: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8240 1726773087.42059: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773087.42073: getting variables 8240 1726773087.42075: in VariableManager get_vars() 8240 1726773087.42098: Calling all_inventory to load vars for managed_node2 8240 1726773087.42100: Calling groups_inventory to load vars for managed_node2 8240 1726773087.42102: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773087.42110: Calling all_plugins_play to load vars for managed_node2 8240 1726773087.42112: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773087.42113: Calling groups_plugins_play to load vars for managed_node2 8240 1726773087.42208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773087.42323: done with get_vars() 8240 1726773087.42330: done getting variables 8240 1726773087.42370: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:27 -0400 (0:00:00.042) 0:01:06.067 **** 8240 1726773087.42396: entering _queue_task() for managed_node2/copy 8240 1726773087.42557: worker is 1 (out of 1 available) 8240 1726773087.42572: exiting _queue_task() for managed_node2/copy 8240 1726773087.42586: done queuing things up, now waiting for results queue to drain 8240 1726773087.42588: waiting for pending results... 10900 1726773087.42721: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 10900 1726773087.42843: in run() - task 0affffe7-6841-885f-bbcf-00000000061c 10900 1726773087.42860: variable 'ansible_search_path' from source: unknown 10900 1726773087.42864: variable 'ansible_search_path' from source: unknown 10900 1726773087.42894: calling self._execute() 10900 1726773087.42967: variable 'ansible_host' from source: host vars for 'managed_node2' 10900 1726773087.42975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10900 1726773087.42987: variable 'omit' from source: magic vars 10900 1726773087.43065: variable 'omit' from source: magic vars 10900 1726773087.43106: variable 'omit' from source: magic vars 10900 1726773087.43127: variable '__kernel_settings_active_profile' from source: set_fact 10900 1726773087.43351: variable '__kernel_settings_active_profile' from source: set_fact 10900 1726773087.43374: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10900 1726773087.43430: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 10900 1726773087.43482: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10900 1726773087.43510: variable 'omit' from source: magic vars 10900 1726773087.43541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10900 1726773087.43567: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10900 1726773087.43584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10900 1726773087.43607: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10900 1726773087.43618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10900 1726773087.43642: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10900 1726773087.43647: variable 'ansible_host' from source: host vars for 'managed_node2' 10900 1726773087.43651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10900 1726773087.43723: Set connection var ansible_pipelining to False 10900 1726773087.43731: Set connection var ansible_timeout to 10 10900 1726773087.43738: Set connection var ansible_module_compression to ZIP_DEFLATED 10900 1726773087.43742: Set connection var ansible_shell_type to sh 10900 1726773087.43747: Set connection var ansible_shell_executable to /bin/sh 10900 1726773087.43751: Set connection var ansible_connection to ssh 10900 1726773087.43766: variable 'ansible_shell_executable' from source: unknown 10900 1726773087.43770: variable 'ansible_connection' from source: unknown 10900 1726773087.43773: variable 'ansible_module_compression' from source: unknown 10900 1726773087.43776: variable 'ansible_shell_type' from source: unknown 10900 1726773087.43780: variable 'ansible_shell_executable' from source: unknown 10900 1726773087.43784: variable 'ansible_host' from source: host vars for 'managed_node2' 10900 1726773087.43790: variable 'ansible_pipelining' from source: unknown 10900 1726773087.43792: variable 'ansible_timeout' from source: unknown 10900 1726773087.43794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10900 1726773087.43880: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10900 1726773087.43892: variable 'omit' from source: magic vars 10900 1726773087.43897: starting attempt loop 10900 1726773087.43902: running the handler 10900 1726773087.43912: _low_level_execute_command(): starting 10900 1726773087.43919: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10900 1726773087.46287: stdout chunk (state=2): >>>/root <<< 10900 1726773087.46415: stderr chunk (state=3): >>><<< 10900 1726773087.46423: stdout chunk (state=3): >>><<< 10900 1726773087.46442: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10900 1726773087.46455: _low_level_execute_command(): starting 10900 1726773087.46461: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966 `" && echo ansible-tmp-1726773087.464503-10900-70577723026966="` echo /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966 `" ) && sleep 0' 10900 1726773087.49068: stdout chunk (state=2): >>>ansible-tmp-1726773087.464503-10900-70577723026966=/root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966 <<< 10900 1726773087.49192: stderr chunk (state=3): >>><<< 10900 1726773087.49200: stdout chunk (state=3): >>><<< 10900 1726773087.49215: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773087.464503-10900-70577723026966=/root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966 , stderr= 
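
For readers tracing the values: the slurp result a few entries above returns /etc/tuned/active_profile base64-encoded. A standalone decode (not part of the role, just a quick check) recovers the profile string that the following set_fact task stores in __kernel_settings_active_profile:

```python
# Decode the base64 "content" field returned by the slurp task above.
import base64

encoded = "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK"
print(base64.b64decode(encoded).decode())
# prints: virtual-guest kernel_settings
# (with a trailing newline, i.e. the 30-byte file seen in later stat results)
```
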
10900 1726773087.49288: variable 'ansible_module_compression' from source: unknown 10900 1726773087.49333: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10900 1726773087.49361: variable 'ansible_facts' from source: unknown 10900 1726773087.49432: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/AnsiballZ_stat.py 10900 1726773087.49521: Sending initial data 10900 1726773087.49529: Sent initial data (150 bytes) 10900 1726773087.52073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp_risu0yh /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/AnsiballZ_stat.py <<< 10900 1726773087.53562: stderr chunk (state=3): >>><<< 10900 1726773087.53574: stdout chunk (state=3): >>><<< 10900 1726773087.53600: done transferring module to remote 10900 1726773087.53615: _low_level_execute_command(): starting 10900 1726773087.53622: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/ /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/AnsiballZ_stat.py && sleep 0' 10900 1726773087.56197: stderr chunk (state=2): >>><<< 10900 1726773087.56209: stdout chunk (state=2): >>><<< 10900 1726773087.56226: _low_level_execute_command() done: rc=0, stdout=, stderr= 10900 1726773087.56231: _low_level_execute_command(): starting 10900 1726773087.56237: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/AnsiballZ_stat.py && sleep 0' 10900 1726773087.72366: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773087.3360043, "mtime": 1726773079.86693, "ctime": 1726773079.86693, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10900 1726773087.73526: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 10900 1726773087.73574: stderr chunk (state=3): >>><<< 10900 1726773087.73581: stdout chunk (state=3): >>><<< 10900 1726773087.73599: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773087.3360043, "mtime": 1726773079.86693, "ctime": 1726773079.86693, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 10900 1726773087.73643: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10900 1726773087.73680: variable 'ansible_module_compression' from source: unknown 10900 1726773087.73715: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10900 1726773087.73733: variable 'ansible_facts' from source: unknown 10900 1726773087.73792: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/AnsiballZ_file.py 10900 1726773087.73881: Sending initial data 10900 1726773087.73890: Sent initial data (150 bytes) 10900 1726773087.76448: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpwnhk1te_ /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/AnsiballZ_file.py <<< 10900 1726773087.77596: stderr chunk (state=3): >>><<< 10900 1726773087.77606: stdout chunk (state=3): >>><<< 10900 1726773087.77626: done transferring module to remote 10900 1726773087.77635: _low_level_execute_command(): starting 10900 1726773087.77640: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/ /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/AnsiballZ_file.py && sleep 0' 10900 1726773087.79990: stderr chunk (state=2): >>><<< 10900 1726773087.80002: stdout chunk (state=2): >>><<< 10900 1726773087.80016: _low_level_execute_command() done: rc=0, stdout=, stderr= 10900 1726773087.80022: _low_level_execute_command(): 
starting 10900 1726773087.80028: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/AnsiballZ_file.py && sleep 0' 10900 1726773087.95964: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpdommy3dv", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10900 1726773087.97121: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10900 1726773087.97133: stdout chunk (state=3): >>><<< 10900 1726773087.97147: stderr chunk (state=3): >>><<< 10900 1726773087.97160: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpdommy3dv", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
10900 1726773087.97198: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmpdommy3dv', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10900 1726773087.97211: _low_level_execute_command(): starting 10900 1726773087.97217: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773087.464503-10900-70577723026966/ > /dev/null 2>&1 && sleep 0' 10900 1726773087.99720: stderr chunk (state=2): >>><<< 10900 1726773087.99731: stdout chunk (state=2): >>><<< 10900 1726773087.99748: _low_level_execute_command() done: rc=0, stdout=, stderr= 10900 1726773087.99757: handler run complete 10900 1726773087.99777: attempt loop complete, returning result 10900 1726773087.99781: _execute() done 10900 1726773087.99784: dumping result to json 10900 1726773087.99791: done dumping result, returning 10900 1726773087.99798: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-885f-bbcf-00000000061c] 10900 1726773087.99804: sending task result for task 0affffe7-6841-885f-bbcf-00000000061c 10900 1726773087.99840: done sending task result for task 0affffe7-6841-885f-bbcf-00000000061c 10900 1726773087.99844: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8240 1726773088.00026: no more pending results, returning what we have 8240 1726773088.00029: results queue empty 8240 1726773088.00030: checking for any_errors_fatal 8240 1726773088.00038: done checking for any_errors_fatal 8240 1726773088.00038: checking for max_fail_percentage 8240 1726773088.00040: done checking for max_fail_percentage 8240 1726773088.00041: checking to see if all hosts have failed and the running result is not ok 8240 1726773088.00042: done checking to see if all hosts have failed 8240 1726773088.00042: getting the remaining hosts for this loop 8240 1726773088.00044: done getting the remaining hosts for this loop 8240 1726773088.00047: getting the next task for host managed_node2 8240 1726773088.00053: done getting next task for host managed_node2 8240 1726773088.00056: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8240 1726773088.00059: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773088.00069: getting variables 8240 1726773088.00070: in VariableManager get_vars() 8240 1726773088.00124: Calling all_inventory to load vars for managed_node2 8240 1726773088.00127: Calling groups_inventory to load vars for managed_node2 8240 1726773088.00129: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773088.00139: Calling all_plugins_play to load vars for managed_node2 8240 1726773088.00142: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773088.00145: Calling groups_plugins_play to load vars for managed_node2 8240 1726773088.00317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773088.00564: done with get_vars() 8240 1726773088.00575: done getting variables 8240 1726773088.00637: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:28 -0400 (0:00:00.582) 0:01:06.650 **** 8240 1726773088.00670: entering _queue_task() for managed_node2/copy 8240 1726773088.00913: worker is 1 (out of 1 available) 8240 1726773088.00928: exiting _queue_task() for managed_node2/copy 8240 1726773088.00942: done queuing things up, now waiting for results queue to drain 8240 1726773088.00944: waiting for pending results... 
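
The "Ensure kernel_settings is in active_profile" trace above shows the copy action's idempotence path: it first runs ansible.legacy.stat on the destination, and because the remote checksum already matches the desired content it skips the content transfer and only runs ansible.legacy.file to enforce mode and ownership, reporting changed=false. Below is a minimal sketch of that decision with values taken from the log; this is a simplification for illustration, not the actual logic of Ansible's copy action plugin.

```python
# Sketch: decide whether the copy action needs to push new content,
# by comparing the SHA-1 of the desired content with the remote stat checksum.
import hashlib

def needs_transfer(desired: bytes, remote_stat: dict) -> bool:
    if not remote_stat.get("exists"):
        return True
    return hashlib.sha1(desired).hexdigest() != remote_stat["checksum"]

# Values from the log; the desired content is assumed to be the same
# 30-byte string the role read back via slurp earlier.
remote_stat = {"exists": True, "size": 30,
               "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd"}
desired = b"virtual-guest kernel_settings\n"
print(needs_transfer(desired, remote_stat))
# Expected False, consistent with "changed": false in the task result above.
```
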
10927 1726773088.01075: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 10927 1726773088.01199: in run() - task 0affffe7-6841-885f-bbcf-00000000061d 10927 1726773088.01218: variable 'ansible_search_path' from source: unknown 10927 1726773088.01222: variable 'ansible_search_path' from source: unknown 10927 1726773088.01250: calling self._execute() 10927 1726773088.01324: variable 'ansible_host' from source: host vars for 'managed_node2' 10927 1726773088.01333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10927 1726773088.01342: variable 'omit' from source: magic vars 10927 1726773088.01427: variable 'omit' from source: magic vars 10927 1726773088.01464: variable 'omit' from source: magic vars 10927 1726773088.01490: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10927 1726773088.01717: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 10927 1726773088.01775: variable '__kernel_settings_tuned_dir' from source: role '' all vars 10927 1726773088.01810: variable 'omit' from source: magic vars 10927 1726773088.01844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10927 1726773088.01871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10927 1726773088.01894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10927 1726773088.01909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10927 1726773088.01922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10927 1726773088.01947: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10927 1726773088.01953: variable 'ansible_host' from source: host vars for 'managed_node2' 10927 1726773088.01957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10927 1726773088.02031: Set connection var ansible_pipelining to False 10927 1726773088.02039: Set connection var ansible_timeout to 10 10927 1726773088.02046: Set connection var ansible_module_compression to ZIP_DEFLATED 10927 1726773088.02048: Set connection var ansible_shell_type to sh 10927 1726773088.02051: Set connection var ansible_shell_executable to /bin/sh 10927 1726773088.02054: Set connection var ansible_connection to ssh 10927 1726773088.02068: variable 'ansible_shell_executable' from source: unknown 10927 1726773088.02070: variable 'ansible_connection' from source: unknown 10927 1726773088.02072: variable 'ansible_module_compression' from source: unknown 10927 1726773088.02074: variable 'ansible_shell_type' from source: unknown 10927 1726773088.02075: variable 'ansible_shell_executable' from source: unknown 10927 1726773088.02077: variable 'ansible_host' from source: host vars for 'managed_node2' 10927 1726773088.02079: variable 'ansible_pipelining' from source: unknown 10927 1726773088.02081: variable 'ansible_timeout' from source: unknown 10927 1726773088.02083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10927 1726773088.02174: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10927 1726773088.02184: variable 'omit' from source: magic vars 10927 1726773088.02192: starting attempt loop 10927 1726773088.02194: running the handler 10927 1726773088.02204: _low_level_execute_command(): starting 10927 1726773088.02210: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10927 1726773088.04602: stdout chunk (state=2): >>>/root <<< 10927 1726773088.04729: stderr chunk (state=3): >>><<< 10927 1726773088.04738: stdout chunk (state=3): >>><<< 10927 1726773088.04758: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10927 1726773088.04773: _low_level_execute_command(): starting 10927 1726773088.04780: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142 `" && echo ansible-tmp-1726773088.0476704-10927-14612075602142="` echo /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142 `" ) && sleep 0' 10927 1726773088.07465: stdout chunk (state=2): >>>ansible-tmp-1726773088.0476704-10927-14612075602142=/root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142 <<< 10927 1726773088.07598: stderr chunk (state=3): >>><<< 10927 1726773088.07608: stdout chunk (state=3): >>><<< 10927 1726773088.07625: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773088.0476704-10927-14612075602142=/root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142 , stderr= 10927 1726773088.07699: variable 'ansible_module_compression' from source: unknown 10927 1726773088.07747: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10927 1726773088.07776: variable 'ansible_facts' from source: unknown 10927 1726773088.07847: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/AnsiballZ_stat.py 10927 1726773088.07942: Sending initial data 10927 1726773088.07949: Sent initial data (151 bytes) 10927 1726773088.10490: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpdczgfqri /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/AnsiballZ_stat.py <<< 10927 1726773088.11602: stderr chunk (state=3): >>><<< 10927 1726773088.11613: stdout chunk (state=3): >>><<< 10927 1726773088.11633: done transferring module to remote 10927 1726773088.11645: _low_level_execute_command(): starting 10927 1726773088.11650: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/ /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/AnsiballZ_stat.py && sleep 0' 10927 1726773088.14046: stderr chunk (state=2): >>><<< 10927 1726773088.14057: stdout chunk (state=2): >>><<< 10927 1726773088.14073: _low_level_execute_command() done: rc=0, stdout=, stderr= 10927 1726773088.14078: _low_level_execute_command(): starting 10927 1726773088.14083: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/AnsiballZ_stat.py && sleep 0' 10927 1726773088.30594: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": 
false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773078.0252256, "mtime": 1726773079.86693, "ctime": 1726773079.86693, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10927 1726773088.31482: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10927 1726773088.31531: stderr chunk (state=3): >>><<< 10927 1726773088.31538: stdout chunk (state=3): >>><<< 10927 1726773088.31554: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773078.0252256, "mtime": 1726773079.86693, "ctime": 1726773079.86693, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
10927 1726773088.31599: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10927 1726773088.31640: variable 'ansible_module_compression' from source: unknown 10927 1726773088.31670: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 10927 1726773088.31690: variable 'ansible_facts' from source: unknown 10927 1726773088.31751: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/AnsiballZ_file.py 10927 1726773088.31845: Sending initial data 10927 1726773088.31854: Sent initial data (151 bytes) 10927 1726773088.34445: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpmkl7d5bj /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/AnsiballZ_file.py <<< 10927 1726773088.35611: stderr chunk (state=3): >>><<< 10927 1726773088.35622: stdout chunk (state=3): >>><<< 10927 1726773088.35641: done transferring module to remote 10927 1726773088.35651: _low_level_execute_command(): starting 10927 1726773088.35656: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/ /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/AnsiballZ_file.py && sleep 0' 10927 1726773088.38058: stderr chunk (state=2): >>><<< 10927 1726773088.38068: stdout chunk (state=2): >>><<< 10927 1726773088.38083: _low_level_execute_command() done: rc=0, stdout=, stderr= 10927 1726773088.38089: _low_level_execute_command(): starting 10927 1726773088.38096: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/AnsiballZ_file.py && sleep 0' 10927 1726773088.54097: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp7w9139ct", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10927 1726773088.55206: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 10927 1726773088.55254: stderr chunk (state=3): >>><<< 10927 1726773088.55261: stdout chunk (state=3): >>><<< 10927 1726773088.55278: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmp7w9139ct", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 10927 1726773088.55311: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmp7w9139ct', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10927 1726773088.55323: _low_level_execute_command(): starting 10927 1726773088.55329: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773088.0476704-10927-14612075602142/ > /dev/null 2>&1 && sleep 0' 10927 1726773088.57749: stderr chunk (state=2): >>><<< 10927 1726773088.57759: stdout chunk (state=2): >>><<< 10927 1726773088.57776: _low_level_execute_command() done: rc=0, stdout=, stderr= 10927 1726773088.57787: handler run complete 10927 1726773088.57811: attempt loop complete, returning result 10927 1726773088.57816: _execute() done 10927 1726773088.57819: dumping result to json 10927 1726773088.57825: done dumping result, returning 10927 1726773088.57832: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-885f-bbcf-00000000061d] 10927 1726773088.57838: sending task result for task 0affffe7-6841-885f-bbcf-00000000061d 10927 1726773088.57870: done sending task result for task 0affffe7-6841-885f-bbcf-00000000061d 10927 1726773088.57874: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8240 1726773088.58152: no more pending results, returning what we have 8240 1726773088.58155: results queue empty 8240 1726773088.58156: checking for any_errors_fatal 8240 1726773088.58162: done checking for any_errors_fatal 8240 1726773088.58162: 
checking for max_fail_percentage 8240 1726773088.58163: done checking for max_fail_percentage 8240 1726773088.58164: checking to see if all hosts have failed and the running result is not ok 8240 1726773088.58165: done checking to see if all hosts have failed 8240 1726773088.58165: getting the remaining hosts for this loop 8240 1726773088.58168: done getting the remaining hosts for this loop 8240 1726773088.58171: getting the next task for host managed_node2 8240 1726773088.58176: done getting next task for host managed_node2 8240 1726773088.58179: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8240 1726773088.58181: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773088.58192: getting variables 8240 1726773088.58193: in VariableManager get_vars() 8240 1726773088.58221: Calling all_inventory to load vars for managed_node2 8240 1726773088.58223: Calling groups_inventory to load vars for managed_node2 8240 1726773088.58224: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773088.58232: Calling all_plugins_play to load vars for managed_node2 8240 1726773088.58234: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773088.58236: Calling groups_plugins_play to load vars for managed_node2 8240 1726773088.58346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773088.58464: done with get_vars() 8240 1726773088.58473: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:28 -0400 (0:00:00.578) 0:01:07.229 **** 8240 1726773088.58538: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773088.58709: worker is 1 (out of 1 available) 8240 1726773088.58723: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773088.58737: done queuing things up, now waiting for results queue to drain 8240 1726773088.58739: waiting for pending results... 
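
A small consistency check on the two tuned control files handled above: the stat results report /etc/tuned/active_profile as 30 bytes and /etc/tuned/profile_mode as 7 bytes. Those sizes line up with the decoded active profile plus a newline, and with the string "manual" plus a newline; the profile_mode content itself is never printed in this log, so "manual" here is an assumption based only on the task name.

```python
# Byte-length cross-check against the stat "size" fields reported above.
active_profile = "virtual-guest kernel_settings\n"   # from the decoded slurp content
profile_mode = "manual\n"                            # assumed from the task name only

assert len(active_profile.encode()) == 30            # size of /etc/tuned/active_profile
assert len(profile_mode.encode()) == 7               # size of /etc/tuned/profile_mode
print("sizes match the stat results")
```
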
10951 1726773088.58875: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 10951 1726773088.58996: in run() - task 0affffe7-6841-885f-bbcf-00000000061e 10951 1726773088.59014: variable 'ansible_search_path' from source: unknown 10951 1726773088.59017: variable 'ansible_search_path' from source: unknown 10951 1726773088.59043: calling self._execute() 10951 1726773088.59115: variable 'ansible_host' from source: host vars for 'managed_node2' 10951 1726773088.59123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10951 1726773088.59128: variable 'omit' from source: magic vars 10951 1726773088.59209: variable 'omit' from source: magic vars 10951 1726773088.59243: variable 'omit' from source: magic vars 10951 1726773088.59264: variable '__kernel_settings_profile_filename' from source: role '' all vars 10951 1726773088.59492: variable '__kernel_settings_profile_filename' from source: role '' all vars 10951 1726773088.59554: variable '__kernel_settings_profile_dir' from source: role '' all vars 10951 1726773088.59618: variable '__kernel_settings_profile_parent' from source: set_fact 10951 1726773088.59627: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10951 1726773088.59717: variable 'omit' from source: magic vars 10951 1726773088.59751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10951 1726773088.59778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10951 1726773088.59800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10951 1726773088.59817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10951 1726773088.59828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10951 1726773088.59853: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10951 1726773088.59859: variable 'ansible_host' from source: host vars for 'managed_node2' 10951 1726773088.59864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10951 1726773088.59936: Set connection var ansible_pipelining to False 10951 1726773088.59943: Set connection var ansible_timeout to 10 10951 1726773088.59951: Set connection var ansible_module_compression to ZIP_DEFLATED 10951 1726773088.59955: Set connection var ansible_shell_type to sh 10951 1726773088.59960: Set connection var ansible_shell_executable to /bin/sh 10951 1726773088.59966: Set connection var ansible_connection to ssh 10951 1726773088.59983: variable 'ansible_shell_executable' from source: unknown 10951 1726773088.59989: variable 'ansible_connection' from source: unknown 10951 1726773088.59992: variable 'ansible_module_compression' from source: unknown 10951 1726773088.59995: variable 'ansible_shell_type' from source: unknown 10951 1726773088.59998: variable 'ansible_shell_executable' from source: unknown 10951 1726773088.60004: variable 'ansible_host' from source: host vars for 'managed_node2' 10951 1726773088.60008: variable 'ansible_pipelining' from source: unknown 10951 1726773088.60012: variable 'ansible_timeout' from source: unknown 10951 1726773088.60016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10951 1726773088.60146: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10951 1726773088.60157: variable 'omit' from source: magic vars 10951 1726773088.60163: starting attempt loop 10951 1726773088.60167: running the handler 10951 1726773088.60178: _low_level_execute_command(): starting 10951 1726773088.60188: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10951 1726773088.62539: stdout chunk (state=2): >>>/root <<< 10951 1726773088.62657: stderr chunk (state=3): >>><<< 10951 1726773088.62665: stdout chunk (state=3): >>><<< 10951 1726773088.62682: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10951 1726773088.62696: _low_level_execute_command(): starting 10951 1726773088.62702: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226 `" && echo ansible-tmp-1726773088.6269162-10951-279161918795226="` echo /root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226 `" ) && sleep 0' 10951 1726773088.65331: stdout chunk (state=2): >>>ansible-tmp-1726773088.6269162-10951-279161918795226=/root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226 <<< 10951 1726773088.65463: stderr chunk (state=3): >>><<< 10951 1726773088.65469: stdout chunk (state=3): >>><<< 10951 1726773088.65484: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773088.6269162-10951-279161918795226=/root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226 , stderr= 10951 1726773088.65521: variable 'ansible_module_compression' from source: unknown 10951 1726773088.65555: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 10951 1726773088.65589: variable 'ansible_facts' from source: unknown 10951 1726773088.65652: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226/AnsiballZ_kernel_settings_get_config.py 10951 1726773088.65754: Sending initial data 10951 1726773088.65761: Sent initial data (174 bytes) 10951 1726773088.68253: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmphvvpzlss /root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226/AnsiballZ_kernel_settings_get_config.py <<< 10951 1726773088.69344: stderr chunk (state=3): >>><<< 10951 1726773088.69352: stdout chunk (state=3): >>><<< 10951 1726773088.69370: done transferring module to remote 10951 1726773088.69381: _low_level_execute_command(): starting 10951 1726773088.69387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226/ /root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10951 1726773088.71715: stderr chunk (state=2): >>><<< 10951 1726773088.71723: stdout chunk (state=2): >>><<< 10951 1726773088.71737: _low_level_execute_command() done: rc=0, stdout=, stderr= 10951 1726773088.71741: _low_level_execute_command(): starting 10951 1726773088.71746: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226/AnsiballZ_kernel_settings_get_config.py && sleep 0' 10951 1726773088.87274: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 10951 1726773088.88344: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10951 1726773088.88392: stderr chunk (state=3): >>><<< 10951 1726773088.88399: stdout chunk (state=3): >>><<< 10951 1726773088.88419: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "65000", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 10951 1726773088.88445: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10951 1726773088.88456: _low_level_execute_command(): starting 10951 1726773088.88462: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773088.6269162-10951-279161918795226/ > /dev/null 2>&1 && sleep 0' 10951 1726773088.90911: stderr chunk (state=2): >>><<< 10951 1726773088.90920: stdout chunk (state=2): >>><<< 10951 1726773088.90935: _low_level_execute_command() done: rc=0, stdout=, stderr= 10951 1726773088.90942: handler run complete 10951 1726773088.90958: attempt loop complete, returning result 10951 1726773088.90961: _execute() done 10951 1726773088.90964: dumping result to json 10951 1726773088.90969: done dumping result, returning 10951 1726773088.90976: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-885f-bbcf-00000000061e] 10951 1726773088.90982: sending task result for task 0affffe7-6841-885f-bbcf-00000000061e 10951 1726773088.91017: done sending task result for task 0affffe7-6841-885f-bbcf-00000000061e 10951 1726773088.91020: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "vm.max_map_count": "65530" }, "sysfs": { "/sys/class/net/lo/mtu": "65000", 
"/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8240 1726773088.91256: no more pending results, returning what we have 8240 1726773088.91259: results queue empty 8240 1726773088.91259: checking for any_errors_fatal 8240 1726773088.91264: done checking for any_errors_fatal 8240 1726773088.91265: checking for max_fail_percentage 8240 1726773088.91266: done checking for max_fail_percentage 8240 1726773088.91266: checking to see if all hosts have failed and the running result is not ok 8240 1726773088.91267: done checking to see if all hosts have failed 8240 1726773088.91267: getting the remaining hosts for this loop 8240 1726773088.91268: done getting the remaining hosts for this loop 8240 1726773088.91271: getting the next task for host managed_node2 8240 1726773088.91276: done getting next task for host managed_node2 8240 1726773088.91278: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8240 1726773088.91280: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773088.91290: getting variables 8240 1726773088.91291: in VariableManager get_vars() 8240 1726773088.91320: Calling all_inventory to load vars for managed_node2 8240 1726773088.91322: Calling groups_inventory to load vars for managed_node2 8240 1726773088.91324: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773088.91331: Calling all_plugins_play to load vars for managed_node2 8240 1726773088.91333: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773088.91335: Calling groups_plugins_play to load vars for managed_node2 8240 1726773088.91480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773088.91604: done with get_vars() 8240 1726773088.91611: done getting variables 8240 1726773088.91655: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:28 -0400 (0:00:00.331) 0:01:07.560 **** 8240 1726773088.91678: entering _queue_task() for managed_node2/template 8240 1726773088.91851: worker is 1 (out of 1 available) 8240 1726773088.91865: exiting _queue_task() for managed_node2/template 8240 1726773088.91879: done queuing things up, now waiting for results queue to drain 8240 1726773088.91880: waiting for pending results... 
10959 1726773088.92011: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 10959 1726773088.92122: in run() - task 0affffe7-6841-885f-bbcf-00000000061f 10959 1726773088.92139: variable 'ansible_search_path' from source: unknown 10959 1726773088.92143: variable 'ansible_search_path' from source: unknown 10959 1726773088.92170: calling self._execute() 10959 1726773088.92245: variable 'ansible_host' from source: host vars for 'managed_node2' 10959 1726773088.92253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10959 1726773088.92262: variable 'omit' from source: magic vars 10959 1726773088.92342: variable 'omit' from source: magic vars 10959 1726773088.92379: variable 'omit' from source: magic vars 10959 1726773088.92619: variable '__kernel_settings_profile_src' from source: role '' all vars 10959 1726773088.92628: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10959 1726773088.92686: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10959 1726773088.92708: variable '__kernel_settings_profile_filename' from source: role '' all vars 10959 1726773088.92754: variable '__kernel_settings_profile_filename' from source: role '' all vars 10959 1726773088.92804: variable '__kernel_settings_profile_dir' from source: role '' all vars 10959 1726773088.92860: variable '__kernel_settings_profile_parent' from source: set_fact 10959 1726773088.92867: variable '__kernel_settings_tuned_profile' from source: role '' all vars 10959 1726773088.92895: variable 'omit' from source: magic vars 10959 1726773088.92930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10959 1726773088.92955: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10959 1726773088.92976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10959 1726773088.92992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10959 1726773088.93007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10959 1726773088.93030: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10959 1726773088.93036: variable 'ansible_host' from source: host vars for 'managed_node2' 10959 1726773088.93040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10959 1726773088.93116: Set connection var ansible_pipelining to False 10959 1726773088.93123: Set connection var ansible_timeout to 10 10959 1726773088.93131: Set connection var ansible_module_compression to ZIP_DEFLATED 10959 1726773088.93134: Set connection var ansible_shell_type to sh 10959 1726773088.93139: Set connection var ansible_shell_executable to /bin/sh 10959 1726773088.93144: Set connection var ansible_connection to ssh 10959 1726773088.93159: variable 'ansible_shell_executable' from source: unknown 10959 1726773088.93163: variable 'ansible_connection' from source: unknown 10959 1726773088.93167: variable 'ansible_module_compression' from source: unknown 10959 1726773088.93170: variable 'ansible_shell_type' from source: unknown 10959 1726773088.93174: variable 'ansible_shell_executable' from source: unknown 10959 1726773088.93178: variable 'ansible_host' from source: host vars for 'managed_node2' 10959 1726773088.93182: 
variable 'ansible_pipelining' from source: unknown 10959 1726773088.93187: variable 'ansible_timeout' from source: unknown 10959 1726773088.93191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10959 1726773088.93281: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10959 1726773088.93295: variable 'omit' from source: magic vars 10959 1726773088.93303: starting attempt loop 10959 1726773088.93308: running the handler 10959 1726773088.93318: _low_level_execute_command(): starting 10959 1726773088.93327: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10959 1726773088.95647: stdout chunk (state=2): >>>/root <<< 10959 1726773088.95762: stderr chunk (state=3): >>><<< 10959 1726773088.95769: stdout chunk (state=3): >>><<< 10959 1726773088.95790: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10959 1726773088.95804: _low_level_execute_command(): starting 10959 1726773088.95810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573 `" && echo ansible-tmp-1726773088.957985-10959-240003735076573="` echo /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573 `" ) && sleep 0' 10959 1726773088.98397: stdout chunk (state=2): >>>ansible-tmp-1726773088.957985-10959-240003735076573=/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573 <<< 10959 1726773088.98528: stderr chunk (state=3): >>><<< 10959 1726773088.98534: stdout chunk (state=3): >>><<< 10959 1726773088.98548: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773088.957985-10959-240003735076573=/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573 , stderr= 10959 1726773088.98563: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 10959 1726773088.98581: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 10959 1726773088.98603: variable 'ansible_search_path' from source: unknown 10959 1726773088.99166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10959 1726773089.00810: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10959 1726773089.00857: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10959 1726773089.00887: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10959 1726773089.00918: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10959 1726773089.00938: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10959 1726773089.01138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10959 1726773089.01159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10959 1726773089.01181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10959 1726773089.01214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10959 1726773089.01227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10959 1726773089.01453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10959 1726773089.01472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10959 1726773089.01491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10959 1726773089.01519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10959 1726773089.01532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10959 1726773089.01768: variable 'ansible_managed' from source: unknown 10959 1726773089.01775: variable '__sections' from source: task vars 10959 1726773089.01863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10959 1726773089.01883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10959 1726773089.01907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10959 1726773089.01933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10959 1726773089.01944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10959 1726773089.02016: variable 'kernel_settings_sysctl' from source: include params 10959 1726773089.02026: variable '__kernel_settings_state_empty' from source: role '' all vars 10959 1726773089.02033: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10959 1726773089.02070: variable '__sysctl_old' from source: task vars 10959 1726773089.02119: variable '__sysctl_old' from source: task vars 10959 1726773089.02257: variable 'kernel_settings_purge' from source: role '' defaults 10959 1726773089.02264: variable 'kernel_settings_sysctl' from source: include params 10959 1726773089.02271: variable '__kernel_settings_state_empty' from source: role '' all vars 10959 1726773089.02276: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10959 1726773089.02281: variable '__kernel_settings_profile_contents' from source: set_fact 10959 1726773089.02421: variable 'kernel_settings_sysfs' from source: include params 10959 1726773089.02430: variable '__kernel_settings_state_empty' from source: role '' all vars 10959 1726773089.02436: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10959 1726773089.02453: variable '__sysfs_old' from source: task vars 10959 1726773089.02496: variable '__sysfs_old' from source: task vars 10959 1726773089.02639: variable 'kernel_settings_purge' from source: role '' defaults 10959 1726773089.02645: variable 'kernel_settings_sysfs' from source: include params 10959 1726773089.02652: variable '__kernel_settings_state_empty' from source: role '' all vars 10959 1726773089.02657: variable '__kernel_settings_previous_replaced' from source: role '' all vars 10959 1726773089.02662: variable '__kernel_settings_profile_contents' from source: set_fact 10959 1726773089.02696: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 10959 1726773089.02707: variable '__systemd_old' from source: task vars 10959 1726773089.02750: variable '__systemd_old' from source: task vars 10959 1726773089.02880: variable 'kernel_settings_purge' from source: role '' defaults 10959 1726773089.02888: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 10959 1726773089.02894: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.02900: variable '__kernel_settings_profile_contents' from source: set_fact 10959 1726773089.02914: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 10959 1726773089.02919: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 10959 1726773089.02924: variable '__trans_huge_old' from source: task vars 10959 1726773089.02966: variable '__trans_huge_old' from source: task vars 10959 1726773089.03096: variable 'kernel_settings_purge' from source: role '' defaults 10959 1726773089.03104: variable 'kernel_settings_transparent_hugepages' from source: role '' defaults 10959 1726773089.03110: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03115: variable '__kernel_settings_profile_contents' from source: set_fact 10959 1726773089.03126: variable '__trans_defrag_old' from source: task vars 10959 
1726773089.03167: variable '__trans_defrag_old' from source: task vars 10959 1726773089.03299: variable 'kernel_settings_purge' from source: role '' defaults 10959 1726773089.03308: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 10959 1726773089.03313: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03319: variable '__kernel_settings_profile_contents' from source: set_fact 10959 1726773089.03333: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03348: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03360: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03368: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03374: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03392: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03403: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03412: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03420: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03427: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03433: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03438: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03444: variable '__kernel_settings_state_absent' from source: role '' all vars 10959 1726773089.03906: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10959 1726773089.03947: variable 'ansible_module_compression' from source: unknown 10959 1726773089.03989: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10959 1726773089.04014: variable 'ansible_facts' from source: unknown 10959 1726773089.04079: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/AnsiballZ_stat.py 10959 1726773089.04173: Sending initial data 10959 1726773089.04180: Sent initial data (151 bytes) 10959 1726773089.06790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpxn0evmw_ /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/AnsiballZ_stat.py <<< 10959 1726773089.07894: stderr chunk (state=3): >>><<< 10959 1726773089.07905: stdout chunk (state=3): >>><<< 10959 1726773089.07925: done transferring module to remote 10959 1726773089.07936: _low_level_execute_command(): starting 10959 1726773089.07941: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/ /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/AnsiballZ_stat.py && sleep 0' 10959 1726773089.10327: stderr chunk (state=2): >>><<< 10959 1726773089.10336: stdout chunk (state=2): >>><<< 10959 1726773089.10351: _low_level_execute_command() done: rc=0, stdout=, stderr= 10959 1726773089.10355: _low_level_execute_command(): starting 10959 
1726773089.10360: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/AnsiballZ_stat.py && sleep 0' 10959 1726773089.27330: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 351, "inode": 90177794, "dev": 51713, "nlink": 1, "atime": 1726773079.85493, "mtime": 1726773079.0469215, "ctime": 1726773079.296924, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "mimetype": "text/plain", "charset": "us-ascii", "version": "674745151", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 10959 1726773089.28497: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10959 1726773089.28549: stderr chunk (state=3): >>><<< 10959 1726773089.28556: stdout chunk (state=3): >>><<< 10959 1726773089.28572: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 351, "inode": 90177794, "dev": 51713, "nlink": 1, "atime": 1726773079.85493, "mtime": 1726773079.0469215, "ctime": 1726773079.296924, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "221aa34fef95c2fe05408be9921820449785a5b2", "mimetype": "text/plain", "charset": "us-ascii", "version": "674745151", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
10959 1726773089.28613: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10959 1726773089.28704: Sending initial data 10959 1726773089.28712: Sent initial data (159 bytes) 10959 1726773089.31295: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpxqb7jz_5/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/source <<< 10959 1726773089.31661: stderr chunk (state=3): >>><<< 10959 1726773089.31668: stdout chunk (state=3): >>><<< 10959 1726773089.31682: _low_level_execute_command(): starting 10959 1726773089.31689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/ /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/source && sleep 0' 10959 1726773089.33999: stderr chunk (state=2): >>><<< 10959 1726773089.34012: stdout chunk (state=2): >>><<< 10959 1726773089.34027: _low_level_execute_command() done: rc=0, stdout=, stderr= 10959 1726773089.34048: variable 'ansible_module_compression' from source: unknown 10959 1726773089.34081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 10959 1726773089.34102: variable 'ansible_facts' from source: unknown 10959 1726773089.34160: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/AnsiballZ_copy.py 10959 1726773089.34252: Sending initial data 10959 1726773089.34259: Sent initial data (151 bytes) 10959 1726773089.36759: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpqfufqyye /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/AnsiballZ_copy.py <<< 10959 1726773089.37894: stderr chunk (state=3): >>><<< 10959 1726773089.37904: stdout chunk (state=3): >>><<< 10959 1726773089.37924: done transferring module to remote 10959 1726773089.37933: _low_level_execute_command(): starting 10959 1726773089.37938: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/ /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/AnsiballZ_copy.py && sleep 0' 10959 1726773089.40308: stderr chunk (state=2): >>><<< 10959 1726773089.40316: stdout chunk (state=2): >>><<< 10959 1726773089.40330: _low_level_execute_command() done: rc=0, stdout=, stderr= 10959 1726773089.40335: _low_level_execute_command(): starting 10959 1726773089.40341: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/AnsiballZ_copy.py && sleep 0' 10959 1726773089.56623: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", 
"src": "/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/source", "md5sum": "1fd7f2202613b516022cf613601e26bd", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 10959 1726773089.57768: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10959 1726773089.57820: stderr chunk (state=3): >>><<< 10959 1726773089.57827: stdout chunk (state=3): >>><<< 10959 1726773089.57843: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/source", "md5sum": "1fd7f2202613b516022cf613601e26bd", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
10959 1726773089.57869: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '3107bf46f5c007ef178305bb243dd11664f9bf35', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10959 1726773089.57901: _low_level_execute_command(): starting 10959 1726773089.57908: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/ > /dev/null 2>&1 && sleep 0' 10959 1726773089.60325: stderr chunk (state=2): >>><<< 10959 1726773089.60333: stdout chunk (state=2): >>><<< 10959 1726773089.60347: _low_level_execute_command() done: rc=0, stdout=, stderr= 10959 1726773089.60356: handler run complete 10959 1726773089.60378: attempt loop complete, returning result 10959 1726773089.60382: _execute() done 10959 1726773089.60386: dumping result to json 10959 1726773089.60392: done dumping result, returning 10959 1726773089.60402: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-885f-bbcf-00000000061f] 10959 1726773089.60408: sending task result for task 0affffe7-6841-885f-bbcf-00000000061f 10959 1726773089.60452: done sending task result for task 0affffe7-6841-885f-bbcf-00000000061f 10959 1726773089.60456: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "1fd7f2202613b516022cf613601e26bd", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 372, "src": "/root/.ansible/tmp/ansible-tmp-1726773088.957985-10959-240003735076573/source", "state": "file", "uid": 0 } 8240 1726773089.60634: no more pending results, returning what we have 8240 1726773089.60638: results queue empty 8240 1726773089.60638: checking for any_errors_fatal 8240 1726773089.60645: done checking for any_errors_fatal 8240 1726773089.60646: checking for max_fail_percentage 8240 1726773089.60647: done checking for max_fail_percentage 8240 1726773089.60648: checking to see if all hosts have failed and the running result is not ok 8240 1726773089.60648: done checking to see if all hosts have failed 8240 1726773089.60649: getting the remaining hosts for this loop 8240 1726773089.60650: done getting the remaining hosts for this loop 8240 1726773089.60653: getting the next task for host managed_node2 8240 1726773089.60659: done getting next task for host managed_node2 8240 1726773089.60662: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8240 1726773089.60664: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773089.60674: getting variables 8240 1726773089.60676: in VariableManager get_vars() 8240 1726773089.60711: Calling all_inventory to load vars for managed_node2 8240 1726773089.60714: Calling groups_inventory to load vars for managed_node2 8240 1726773089.60715: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773089.60725: Calling all_plugins_play to load vars for managed_node2 8240 1726773089.60728: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773089.60730: Calling groups_plugins_play to load vars for managed_node2 8240 1726773089.60842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773089.60961: done with get_vars() 8240 1726773089.60970: done getting variables 8240 1726773089.61017: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:29 -0400 (0:00:00.693) 0:01:08.254 **** 8240 1726773089.61040: entering _queue_task() for managed_node2/service 8240 1726773089.61209: worker is 1 (out of 1 available) 8240 1726773089.61223: exiting _queue_task() for managed_node2/service 8240 1726773089.61238: done queuing things up, now waiting for results queue to drain 8240 1726773089.61240: waiting for pending results... 
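Note: the 'Apply kernel settings' task above rendered kernel_settings.j2 locally, transferred it over sftp, and copied it into place (changed=true, 372 bytes, mode 0644). A minimal manual spot-check of the deployed file, assuming shell access to managed_node2 (checksum and SELinux context taken from the copy result earlier in this log):

    sha1sum /etc/tuned/kernel_settings/tuned.conf   # expect 3107bf46f5c007ef178305bb243dd11664f9bf35
    ls -lZ /etc/tuned/kernel_settings/tuned.conf    # expect 0644, root:root, system_u:object_r:tuned_etc_t:s0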
10977 1726773089.61366: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 10977 1726773089.61489: in run() - task 0affffe7-6841-885f-bbcf-000000000620 10977 1726773089.61507: variable 'ansible_search_path' from source: unknown 10977 1726773089.61511: variable 'ansible_search_path' from source: unknown 10977 1726773089.61546: variable '__kernel_settings_services' from source: include_vars 10977 1726773089.61783: variable '__kernel_settings_services' from source: include_vars 10977 1726773089.62132: variable 'omit' from source: magic vars 10977 1726773089.62221: variable 'ansible_host' from source: host vars for 'managed_node2' 10977 1726773089.62232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10977 1726773089.62241: variable 'omit' from source: magic vars 10977 1726773089.62417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10977 1726773089.62581: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10977 1726773089.62616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10977 1726773089.62638: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10977 1726773089.62659: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10977 1726773089.62732: variable '__kernel_settings_register_profile' from source: set_fact 10977 1726773089.62743: variable '__kernel_settings_register_mode' from source: set_fact 10977 1726773089.62758: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 10977 1726773089.62763: when evaluation is False, skipping this task 10977 1726773089.62784: variable 'item' from source: unknown 10977 1726773089.62832: variable 'item' from source: unknown skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 10977 1726773089.62860: dumping result to json 10977 1726773089.62866: done dumping result, returning 10977 1726773089.62872: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-885f-bbcf-000000000620] 10977 1726773089.62877: sending task result for task 0affffe7-6841-885f-bbcf-000000000620 10977 1726773089.62904: done sending task result for task 0affffe7-6841-885f-bbcf-000000000620 10977 1726773089.62908: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 8240 1726773089.63059: no more pending results, returning what we have 8240 1726773089.63063: results queue empty 8240 1726773089.63063: checking for any_errors_fatal 8240 1726773089.63072: done checking for any_errors_fatal 8240 1726773089.63073: checking for max_fail_percentage 8240 1726773089.63074: done checking for max_fail_percentage 8240 1726773089.63074: checking to see if all hosts have failed and the running result is not ok 8240 1726773089.63075: done checking to see if all hosts have failed 8240 1726773089.63076: getting the remaining hosts for this loop 8240 1726773089.63077: done getting the remaining 
hosts for this loop 8240 1726773089.63080: getting the next task for host managed_node2 8240 1726773089.63088: done getting next task for host managed_node2 8240 1726773089.63091: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8240 1726773089.63093: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773089.63107: getting variables 8240 1726773089.63109: in VariableManager get_vars() 8240 1726773089.63133: Calling all_inventory to load vars for managed_node2 8240 1726773089.63135: Calling groups_inventory to load vars for managed_node2 8240 1726773089.63136: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773089.63146: Calling all_plugins_play to load vars for managed_node2 8240 1726773089.63148: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773089.63150: Calling groups_plugins_play to load vars for managed_node2 8240 1726773089.63252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773089.63369: done with get_vars() 8240 1726773089.63376: done getting variables 8240 1726773089.63418: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:29 -0400 (0:00:00.023) 0:01:08.278 **** 8240 1726773089.63441: entering _queue_task() for managed_node2/command 8240 1726773089.63597: worker is 1 (out of 1 available) 8240 1726773089.63611: exiting _queue_task() for managed_node2/command 8240 1726773089.63624: done queuing things up, now waiting for results queue to drain 8240 1726773089.63625: waiting for pending results... 
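Note: the 'Restart tuned to apply active profile, mode changes' task above was skipped because its condition (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed) evaluated to False. Had the condition held, the service action would have restarted tuned; a rough manual equivalent, assuming a systemd-managed tuned service on the target:

    systemctl restart tuned
    systemctl status tuned --no-pager   # confirm the service came back up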
10978 1726773089.63750: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 10978 1726773089.63863: in run() - task 0affffe7-6841-885f-bbcf-000000000621 10978 1726773089.63878: variable 'ansible_search_path' from source: unknown 10978 1726773089.63882: variable 'ansible_search_path' from source: unknown 10978 1726773089.63912: calling self._execute() 10978 1726773089.63981: variable 'ansible_host' from source: host vars for 'managed_node2' 10978 1726773089.63989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10978 1726773089.63996: variable 'omit' from source: magic vars 10978 1726773089.64325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10978 1726773089.64794: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10978 1726773089.64830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10978 1726773089.64855: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10978 1726773089.64878: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10978 1726773089.64960: variable '__kernel_settings_register_profile' from source: set_fact 10978 1726773089.64983: Evaluated conditional (not __kernel_settings_register_profile is changed): True 10978 1726773089.65074: variable '__kernel_settings_register_mode' from source: set_fact 10978 1726773089.65089: Evaluated conditional (not __kernel_settings_register_mode is changed): True 10978 1726773089.65164: variable '__kernel_settings_register_apply' from source: set_fact 10978 1726773089.65175: Evaluated conditional (__kernel_settings_register_apply is changed): True 10978 1726773089.65183: variable 'omit' from source: magic vars 10978 1726773089.65217: variable 'omit' from source: magic vars 10978 1726773089.65303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10978 1726773089.66732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10978 1726773089.66791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10978 1726773089.66822: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10978 1726773089.66848: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10978 1726773089.66868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10978 1726773089.66927: variable '__kernel_settings_active_profile' from source: set_fact 10978 1726773089.66957: variable 'omit' from source: magic vars 10978 1726773089.66981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10978 1726773089.67008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10978 1726773089.67024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10978 1726773089.67038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10978 1726773089.67048: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10978 1726773089.67072: variable 'inventory_hostname' from source: host vars for 'managed_node2' 10978 1726773089.67077: variable 'ansible_host' from source: host vars for 'managed_node2' 10978 1726773089.67082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10978 1726773089.67153: Set connection var ansible_pipelining to False 10978 1726773089.67161: Set connection var ansible_timeout to 10 10978 1726773089.67169: Set connection var ansible_module_compression to ZIP_DEFLATED 10978 1726773089.67174: Set connection var ansible_shell_type to sh 10978 1726773089.67180: Set connection var ansible_shell_executable to /bin/sh 10978 1726773089.67186: Set connection var ansible_connection to ssh 10978 1726773089.67207: variable 'ansible_shell_executable' from source: unknown 10978 1726773089.67211: variable 'ansible_connection' from source: unknown 10978 1726773089.67215: variable 'ansible_module_compression' from source: unknown 10978 1726773089.67218: variable 'ansible_shell_type' from source: unknown 10978 1726773089.67221: variable 'ansible_shell_executable' from source: unknown 10978 1726773089.67225: variable 'ansible_host' from source: host vars for 'managed_node2' 10978 1726773089.67229: variable 'ansible_pipelining' from source: unknown 10978 1726773089.67232: variable 'ansible_timeout' from source: unknown 10978 1726773089.67237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 10978 1726773089.67306: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10978 1726773089.67318: variable 'omit' from source: magic vars 10978 1726773089.67324: starting attempt loop 10978 1726773089.67327: running the handler 10978 1726773089.67339: _low_level_execute_command(): starting 10978 1726773089.67345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10978 1726773089.69675: stdout chunk (state=2): >>>/root <<< 10978 1726773089.69803: stderr chunk (state=3): >>><<< 10978 1726773089.69810: stdout chunk (state=3): >>><<< 10978 1726773089.69830: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 10978 1726773089.69842: _low_level_execute_command(): starting 10978 1726773089.69848: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076 `" && echo ansible-tmp-1726773089.6983767-10978-128684468236076="` echo /root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076 `" ) && sleep 0' 10978 1726773089.72446: stdout chunk (state=2): >>>ansible-tmp-1726773089.6983767-10978-128684468236076=/root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076 <<< 10978 1726773089.72576: stderr chunk (state=3): >>><<< 10978 1726773089.72584: stdout chunk (state=3): >>><<< 10978 1726773089.72602: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773089.6983767-10978-128684468236076=/root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076 , stderr= 10978 1726773089.72627: variable 'ansible_module_compression' from source: unknown 10978 1726773089.72665: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10978 1726773089.72696: variable 'ansible_facts' from source: unknown 10978 1726773089.72766: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076/AnsiballZ_command.py 10978 1726773089.72868: Sending initial data 10978 1726773089.72875: Sent initial data (155 bytes) 10978 1726773089.76416: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp_czzii05 /root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076/AnsiballZ_command.py <<< 10978 1726773089.77912: stderr chunk (state=3): >>><<< 10978 1726773089.77924: stdout chunk (state=3): >>><<< 10978 1726773089.77949: done transferring module to remote 10978 1726773089.77963: _low_level_execute_command(): starting 10978 1726773089.77968: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076/ /root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076/AnsiballZ_command.py && sleep 0' 10978 1726773089.80613: stderr chunk (state=2): >>><<< 10978 1726773089.80623: stdout chunk (state=2): >>><<< 10978 1726773089.80638: _low_level_execute_command() done: rc=0, stdout=, stderr= 10978 1726773089.80643: _low_level_execute_command(): starting 10978 1726773089.80648: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076/AnsiballZ_command.py && sleep 0' 10978 1726773091.13381: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:29.955870", "end": "2024-09-19 15:11:31.131605", "delta": "0:00:01.175735", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10978 1726773091.14663: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 10978 1726773091.14711: stderr chunk (state=3): >>><<< 10978 1726773091.14718: stdout chunk (state=3): >>><<< 10978 1726773091.14734: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:29.955870", "end": "2024-09-19 15:11:31.131605", "delta": "0:00:01.175735", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
10978 1726773091.14761: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 10978 1726773091.14770: _low_level_execute_command(): starting 10978 1726773091.14777: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773089.6983767-10978-128684468236076/ > /dev/null 2>&1 && sleep 0' 10978 1726773091.17328: stderr chunk (state=2): >>><<< 10978 1726773091.17339: stdout chunk (state=2): >>><<< 10978 1726773091.17357: _low_level_execute_command() done: rc=0, stdout=, stderr= 10978 1726773091.17364: handler run complete 10978 1726773091.17382: Evaluated conditional (True): True 10978 1726773091.17392: attempt loop complete, returning result 10978 1726773091.17396: _execute() done 10978 1726773091.17399: dumping result to json 10978 1726773091.17406: done dumping result, returning 10978 1726773091.17414: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-885f-bbcf-000000000621] 10978 1726773091.17420: sending task result for task 0affffe7-6841-885f-bbcf-000000000621 10978 1726773091.17448: done sending task result for task 0affffe7-6841-885f-bbcf-000000000621 10978 1726773091.17452: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.175735", "end": "2024-09-19 15:11:31.131605", "rc": 0, "start": "2024-09-19 15:11:29.955870" } 8240 1726773091.17597: no more pending results, returning what we have 8240 1726773091.17601: results queue empty 8240 1726773091.17601: checking for any_errors_fatal 8240 1726773091.17609: done checking for any_errors_fatal 8240 1726773091.17610: checking for max_fail_percentage 8240 1726773091.17612: done checking for max_fail_percentage 8240 1726773091.17612: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.17613: done checking to see if all hosts have failed 8240 1726773091.17614: getting the remaining hosts for this loop 8240 1726773091.17615: done getting the remaining hosts for this loop 8240 1726773091.17618: getting the next task for host managed_node2 8240 1726773091.17625: done getting next task for host managed_node2 8240 1726773091.17628: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8240 1726773091.17631: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 8240 1726773091.17642: getting variables 8240 1726773091.17644: in VariableManager get_vars() 8240 1726773091.17678: Calling all_inventory to load vars for managed_node2 8240 1726773091.17680: Calling groups_inventory to load vars for managed_node2 8240 1726773091.17682: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.17694: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.17697: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.17699: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.17819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.18154: done with get_vars() 8240 1726773091.18161: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:31 -0400 (0:00:01.547) 0:01:09.826 **** 8240 1726773091.18233: entering _queue_task() for managed_node2/include_tasks 8240 1726773091.18407: worker is 1 (out of 1 available) 8240 1726773091.18420: exiting _queue_task() for managed_node2/include_tasks 8240 1726773091.18433: done queuing things up, now waiting for results queue to drain 8240 1726773091.18434: waiting for pending results... 11029 1726773091.18569: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 11029 1726773091.18698: in run() - task 0affffe7-6841-885f-bbcf-000000000622 11029 1726773091.18717: variable 'ansible_search_path' from source: unknown 11029 1726773091.18721: variable 'ansible_search_path' from source: unknown 11029 1726773091.18748: calling self._execute() 11029 1726773091.18822: variable 'ansible_host' from source: host vars for 'managed_node2' 11029 1726773091.18831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11029 1726773091.18841: variable 'omit' from source: magic vars 11029 1726773091.19179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11029 1726773091.19362: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11029 1726773091.19399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11029 1726773091.19429: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11029 1726773091.19455: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11029 1726773091.19546: variable '__kernel_settings_register_apply' from source: set_fact 11029 1726773091.19569: Evaluated conditional (__kernel_settings_register_apply is changed): True 11029 1726773091.19578: _execute() done 11029 1726773091.19582: dumping result to json 11029 1726773091.19588: done dumping result, returning 11029 1726773091.19594: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-885f-bbcf-000000000622] 11029 1726773091.19599: sending task result for task 0affffe7-6841-885f-bbcf-000000000622 11029 1726773091.19626: done sending task result for task 0affffe7-6841-885f-bbcf-000000000622 11029 1726773091.19629: WORKER PROCESS EXITING 8240 
1726773091.19738: no more pending results, returning what we have 8240 1726773091.19743: in VariableManager get_vars() 8240 1726773091.19779: Calling all_inventory to load vars for managed_node2 8240 1726773091.19782: Calling groups_inventory to load vars for managed_node2 8240 1726773091.19783: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.19795: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.19798: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.19800: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.19922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.20038: done with get_vars() 8240 1726773091.20043: variable 'ansible_search_path' from source: unknown 8240 1726773091.20043: variable 'ansible_search_path' from source: unknown 8240 1726773091.20067: we have included files to process 8240 1726773091.20068: generating all_blocks data 8240 1726773091.20069: done generating all_blocks data 8240 1726773091.20073: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773091.20074: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773091.20075: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8240 1726773091.20335: done processing included file 8240 1726773091.20338: iterating over new_blocks loaded from include file 8240 1726773091.20339: in VariableManager get_vars() 8240 1726773091.20354: done with get_vars() 8240 1726773091.20355: filtering new block on tags 8240 1726773091.20392: done filtering new block on tags 8240 1726773091.20394: done iterating over new_blocks loaded from include file 8240 1726773091.20394: extending task lists for all hosts with included blocks 8240 1726773091.20799: done extending task lists 8240 1726773091.20802: done processing included files 8240 1726773091.20803: results queue empty 8240 1726773091.20803: checking for any_errors_fatal 8240 1726773091.20807: done checking for any_errors_fatal 8240 1726773091.20807: checking for max_fail_percentage 8240 1726773091.20808: done checking for max_fail_percentage 8240 1726773091.20808: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.20809: done checking to see if all hosts have failed 8240 1726773091.20809: getting the remaining hosts for this loop 8240 1726773091.20810: done getting the remaining hosts for this loop 8240 1726773091.20813: getting the next task for host managed_node2 8240 1726773091.20817: done getting next task for host managed_node2 8240 1726773091.20819: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8240 1726773091.20821: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773091.20827: getting variables 8240 1726773091.20828: in VariableManager get_vars() 8240 1726773091.20836: Calling all_inventory to load vars for managed_node2 8240 1726773091.20838: Calling groups_inventory to load vars for managed_node2 8240 1726773091.20839: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.20842: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.20843: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.20845: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.20922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.21030: done with get_vars() 8240 1726773091.21037: done getting variables 8240 1726773091.21062: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.028) 0:01:09.854 **** 8240 1726773091.21083: entering _queue_task() for managed_node2/command 8240 1726773091.21258: worker is 1 (out of 1 available) 8240 1726773091.21271: exiting _queue_task() for managed_node2/command 8240 1726773091.21282: done queuing things up, now waiting for results queue to drain 8240 1726773091.21284: waiting for pending results... 
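Note on the preceding entries: the "Verify settings" step is an include_tasks that pulls in verify_settings.yml and is gated on whether the earlier tuned-adm profile apply reported a change. A minimal sketch of that step, reconstructed only from the task name, included file, and conditional visible in the log above (not the role's actual source):

```yaml
# Hedged reconstruction from the log entries above, not the actual
# kernel_settings role source. The conditional is the one the log reports as
# "Evaluated conditional (__kernel_settings_register_apply is changed): True".
- name: Verify settings
  include_tasks: verify_settings.yml
  when: __kernel_settings_register_apply is changed
```

The worker output that follows executes the first task from the included file, "Check that settings are applied correctly".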
11030 1726773091.21411: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 11030 1726773091.21546: in run() - task 0affffe7-6841-885f-bbcf-0000000007f9 11030 1726773091.21561: variable 'ansible_search_path' from source: unknown 11030 1726773091.21565: variable 'ansible_search_path' from source: unknown 11030 1726773091.21593: calling self._execute() 11030 1726773091.21660: variable 'ansible_host' from source: host vars for 'managed_node2' 11030 1726773091.21669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11030 1726773091.21677: variable 'omit' from source: magic vars 11030 1726773091.21751: variable 'omit' from source: magic vars 11030 1726773091.21796: variable 'omit' from source: magic vars 11030 1726773091.21820: variable 'omit' from source: magic vars 11030 1726773091.21852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11030 1726773091.21879: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11030 1726773091.21901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11030 1726773091.21918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11030 1726773091.21930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11030 1726773091.21953: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11030 1726773091.21958: variable 'ansible_host' from source: host vars for 'managed_node2' 11030 1726773091.21960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11030 1726773091.22028: Set connection var ansible_pipelining to False 11030 1726773091.22033: Set connection var ansible_timeout to 10 11030 1726773091.22038: Set connection var ansible_module_compression to ZIP_DEFLATED 11030 1726773091.22040: Set connection var ansible_shell_type to sh 11030 1726773091.22043: Set connection var ansible_shell_executable to /bin/sh 11030 1726773091.22045: Set connection var ansible_connection to ssh 11030 1726773091.22060: variable 'ansible_shell_executable' from source: unknown 11030 1726773091.22063: variable 'ansible_connection' from source: unknown 11030 1726773091.22065: variable 'ansible_module_compression' from source: unknown 11030 1726773091.22067: variable 'ansible_shell_type' from source: unknown 11030 1726773091.22069: variable 'ansible_shell_executable' from source: unknown 11030 1726773091.22071: variable 'ansible_host' from source: host vars for 'managed_node2' 11030 1726773091.22073: variable 'ansible_pipelining' from source: unknown 11030 1726773091.22075: variable 'ansible_timeout' from source: unknown 11030 1726773091.22078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11030 1726773091.22166: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11030 1726773091.22176: variable 'omit' from source: magic vars 11030 1726773091.22180: starting attempt loop 11030 1726773091.22182: running the handler 11030 1726773091.22198: 
_low_level_execute_command(): starting 11030 1726773091.22206: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11030 1726773091.24563: stdout chunk (state=2): >>>/root <<< 11030 1726773091.24683: stderr chunk (state=3): >>><<< 11030 1726773091.24690: stdout chunk (state=3): >>><<< 11030 1726773091.24709: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11030 1726773091.24722: _low_level_execute_command(): starting 11030 1726773091.24728: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016 `" && echo ansible-tmp-1726773091.2471673-11030-123612486553016="` echo /root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016 `" ) && sleep 0' 11030 1726773091.27362: stdout chunk (state=2): >>>ansible-tmp-1726773091.2471673-11030-123612486553016=/root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016 <<< 11030 1726773091.27479: stderr chunk (state=3): >>><<< 11030 1726773091.27487: stdout chunk (state=3): >>><<< 11030 1726773091.27500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773091.2471673-11030-123612486553016=/root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016 , stderr= 11030 1726773091.27526: variable 'ansible_module_compression' from source: unknown 11030 1726773091.27569: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11030 1726773091.27601: variable 'ansible_facts' from source: unknown 11030 1726773091.27683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016/AnsiballZ_command.py 11030 1726773091.28012: Sending initial data 11030 1726773091.28019: Sent initial data (155 bytes) 11030 1726773091.30525: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpgbixts4x /root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016/AnsiballZ_command.py <<< 11030 1726773091.32994: stderr chunk (state=3): >>><<< 11030 1726773091.33004: stdout chunk (state=3): >>><<< 11030 1726773091.33029: done transferring module to remote 11030 1726773091.33043: _low_level_execute_command(): starting 11030 1726773091.33050: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016/ /root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016/AnsiballZ_command.py && sleep 0' 11030 1726773091.35913: stderr chunk (state=2): >>><<< 11030 1726773091.35923: stdout chunk (state=2): >>><<< 11030 1726773091.35939: _low_level_execute_command() done: rc=0, stdout=, stderr= 11030 1726773091.35944: _low_level_execute_command(): starting 11030 1726773091.35949: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016/AnsiballZ_command.py && sleep 0' 11030 1726773091.62265: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:31.510968", "end": "2024-09-19 15:11:31.615096", "delta": "0:00:00.104128", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11030 1726773091.62939: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11030 1726773091.62987: stderr chunk (state=3): >>><<< 11030 1726773091.62994: stdout chunk (state=3): >>><<< 11030 1726773091.63011: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:31.510968", "end": "2024-09-19 15:11:31.615096", "delta": "0:00:00.104128", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11030 1726773091.63054: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11030 1726773091.63064: _low_level_execute_command(): starting 11030 1726773091.63071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773091.2471673-11030-123612486553016/ > /dev/null 2>&1 && sleep 0' 11030 1726773091.65541: stderr chunk (state=2): >>><<< 11030 1726773091.65549: stdout chunk (state=2): >>><<< 11030 1726773091.65564: _low_level_execute_command() done: rc=0, stdout=, stderr= 11030 1726773091.65572: handler run complete 11030 1726773091.65597: Evaluated conditional (False): False 11030 1726773091.65612: attempt loop complete, returning result 11030 1726773091.65616: _execute() done 11030 1726773091.65619: dumping result to json 11030 1726773091.65625: done dumping result, returning 11030 1726773091.65634: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-885f-bbcf-0000000007f9] 11030 1726773091.65644: sending task result for task 0affffe7-6841-885f-bbcf-0000000007f9 11030 1726773091.65684: done sending task result for task 0affffe7-6841-885f-bbcf-0000000007f9 11030 1726773091.65690: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.104128", "end": "2024-09-19 15:11:31.615096", "rc": 0, "start": "2024-09-19 15:11:31.510968" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 
8240 1726773091.66137: no more pending results, returning what we have 8240 1726773091.66141: results queue empty 8240 1726773091.66142: checking for any_errors_fatal 8240 1726773091.66144: done checking for any_errors_fatal 8240 1726773091.66144: checking for max_fail_percentage 8240 1726773091.66146: done checking for max_fail_percentage 8240 1726773091.66147: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.66148: done checking to see if all hosts have failed 8240 1726773091.66148: getting the remaining hosts for this loop 8240 1726773091.66149: done getting the remaining hosts for this loop 8240 1726773091.66153: getting the next task for host managed_node2 8240 1726773091.66160: done getting next task for host managed_node2 8240 1726773091.66163: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8240 1726773091.66166: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773091.66176: getting variables 8240 1726773091.66178: in VariableManager get_vars() 8240 1726773091.66218: Calling all_inventory to load vars for managed_node2 8240 1726773091.66222: Calling groups_inventory to load vars for managed_node2 8240 1726773091.66224: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.66234: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.66237: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.66240: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.66424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.66624: done with get_vars() 8240 1726773091.66632: done getting variables 8240 1726773091.66673: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.456) 0:01:10.310 **** 8240 1726773091.66712: entering _queue_task() for managed_node2/shell 8240 1726773091.66878: worker is 1 (out of 1 available) 8240 1726773091.66894: exiting _queue_task() for managed_node2/shell 8240 1726773091.66909: done queuing things up, now waiting for results queue to drain 8240 1726773091.66911: waiting for pending results... 
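The next two tasks in verify_settings.yml, "Get last verify results from log" and "Report errors that are not bootloader errors", are both guarded by the same condition on the registered verify result; as the two skipping: results that follow show, that condition is False in this run because the verification succeeded. A structural sketch only, where the shell command and fail message are placeholders and only the task names, action plugins, and the guard come from the log:

```yaml
# Structure only; the shell body and fail message below are placeholders,
# not the role's actual content. Both tasks share the guard that the log
# reports as "false_condition".
- name: Get last verify results from log
  shell: "true"  # placeholder; presumably inspects /var/log/tuned/tuned.log
  when: __kernel_settings_register_verify_values is failed

- name: Report errors that are not bootloader errors
  fail:
    msg: Verification reported errors  # placeholder message
  when: __kernel_settings_register_verify_values is failed
```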
11048 1726773091.67035: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 11048 1726773091.67169: in run() - task 0affffe7-6841-885f-bbcf-0000000007fa 11048 1726773091.67187: variable 'ansible_search_path' from source: unknown 11048 1726773091.67191: variable 'ansible_search_path' from source: unknown 11048 1726773091.67219: calling self._execute() 11048 1726773091.67291: variable 'ansible_host' from source: host vars for 'managed_node2' 11048 1726773091.67299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11048 1726773091.67307: variable 'omit' from source: magic vars 11048 1726773091.67635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11048 1726773091.67815: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11048 1726773091.67849: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11048 1726773091.67876: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11048 1726773091.67906: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11048 1726773091.67992: variable '__kernel_settings_register_verify_values' from source: set_fact 11048 1726773091.68014: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11048 1726773091.68020: when evaluation is False, skipping this task 11048 1726773091.68024: _execute() done 11048 1726773091.68028: dumping result to json 11048 1726773091.68031: done dumping result, returning 11048 1726773091.68037: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-885f-bbcf-0000000007fa] 11048 1726773091.68043: sending task result for task 0affffe7-6841-885f-bbcf-0000000007fa 11048 1726773091.68065: done sending task result for task 0affffe7-6841-885f-bbcf-0000000007fa 11048 1726773091.68068: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773091.68174: no more pending results, returning what we have 8240 1726773091.68177: results queue empty 8240 1726773091.68178: checking for any_errors_fatal 8240 1726773091.68189: done checking for any_errors_fatal 8240 1726773091.68190: checking for max_fail_percentage 8240 1726773091.68191: done checking for max_fail_percentage 8240 1726773091.68192: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.68193: done checking to see if all hosts have failed 8240 1726773091.68193: getting the remaining hosts for this loop 8240 1726773091.68194: done getting the remaining hosts for this loop 8240 1726773091.68198: getting the next task for host managed_node2 8240 1726773091.68207: done getting next task for host managed_node2 8240 1726773091.68210: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8240 1726773091.68214: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773091.68228: getting variables 8240 1726773091.68229: in VariableManager get_vars() 8240 1726773091.68261: Calling all_inventory to load vars for managed_node2 8240 1726773091.68263: Calling groups_inventory to load vars for managed_node2 8240 1726773091.68265: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.68272: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.68274: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.68276: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.68389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.68547: done with get_vars() 8240 1726773091.68556: done getting variables 8240 1726773091.68612: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.019) 0:01:10.330 **** 8240 1726773091.68636: entering _queue_task() for managed_node2/fail 8240 1726773091.68819: worker is 1 (out of 1 available) 8240 1726773091.68832: exiting _queue_task() for managed_node2/fail 8240 1726773091.68843: done queuing things up, now waiting for results queue to drain 8240 1726773091.68845: waiting for pending results... 
11049 1726773091.68959: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 11049 1726773091.69076: in run() - task 0affffe7-6841-885f-bbcf-0000000007fb 11049 1726773091.69095: variable 'ansible_search_path' from source: unknown 11049 1726773091.69099: variable 'ansible_search_path' from source: unknown 11049 1726773091.69126: calling self._execute() 11049 1726773091.69197: variable 'ansible_host' from source: host vars for 'managed_node2' 11049 1726773091.69206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11049 1726773091.69215: variable 'omit' from source: magic vars 11049 1726773091.69534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11049 1726773091.69773: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11049 1726773091.69815: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11049 1726773091.69845: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11049 1726773091.69873: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11049 1726773091.69967: variable '__kernel_settings_register_verify_values' from source: set_fact 11049 1726773091.69993: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11049 1726773091.69998: when evaluation is False, skipping this task 11049 1726773091.70001: _execute() done 11049 1726773091.70005: dumping result to json 11049 1726773091.70007: done dumping result, returning 11049 1726773091.70010: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-885f-bbcf-0000000007fb] 11049 1726773091.70014: sending task result for task 0affffe7-6841-885f-bbcf-0000000007fb 11049 1726773091.70037: done sending task result for task 0affffe7-6841-885f-bbcf-0000000007fb 11049 1726773091.70039: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773091.70627: no more pending results, returning what we have 8240 1726773091.70630: results queue empty 8240 1726773091.70631: checking for any_errors_fatal 8240 1726773091.70636: done checking for any_errors_fatal 8240 1726773091.70637: checking for max_fail_percentage 8240 1726773091.70638: done checking for max_fail_percentage 8240 1726773091.70639: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.70640: done checking to see if all hosts have failed 8240 1726773091.70641: getting the remaining hosts for this loop 8240 1726773091.70642: done getting the remaining hosts for this loop 8240 1726773091.70645: getting the next task for host managed_node2 8240 1726773091.70653: done getting next task for host managed_node2 8240 1726773091.70657: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8240 1726773091.70660: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773091.70676: getting variables 8240 1726773091.70678: in VariableManager get_vars() 8240 1726773091.70707: Calling all_inventory to load vars for managed_node2 8240 1726773091.70709: Calling groups_inventory to load vars for managed_node2 8240 1726773091.70710: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.70718: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.70720: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.70723: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.70842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.71003: done with get_vars() 8240 1726773091.71010: done getting variables 8240 1726773091.71051: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.024) 0:01:10.354 **** 8240 1726773091.71072: entering _queue_task() for managed_node2/set_fact 8240 1726773091.71231: worker is 1 (out of 1 available) 8240 1726773091.71245: exiting _queue_task() for managed_node2/set_fact 8240 1726773091.71258: done queuing things up, now waiting for results queue to drain 8240 1726773091.71259: waiting for pending results... 
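The role then records two bookkeeping facts, whose values appear in the ok: results below: kernel_settings_reboot_required is set to false, and __kernel_settings_changed is set to true, which the test playbook apparently checks later in its "Ensure role reported changed" step. A hedged sketch using the literal values from this run; in the role itself these are presumably derived from the registered profile/mode/apply results that the log shows being consulted:

```yaml
# Values shown literally, matching the two ok: results below. The real role
# presumably computes __kernel_settings_changed from
# __kernel_settings_register_profile, __kernel_settings_register_mode and
# __kernel_settings_register_apply, which the log shows being read.
- name: Set the flag that reboot is needed to apply changes
  set_fact:
    kernel_settings_reboot_required: false

- name: Set flag to indicate changed for testing
  set_fact:
    __kernel_settings_changed: true
```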
11051 1726773091.71384: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 11051 1726773091.71507: in run() - task 0affffe7-6841-885f-bbcf-000000000623 11051 1726773091.71523: variable 'ansible_search_path' from source: unknown 11051 1726773091.71527: variable 'ansible_search_path' from source: unknown 11051 1726773091.71554: calling self._execute() 11051 1726773091.71630: variable 'ansible_host' from source: host vars for 'managed_node2' 11051 1726773091.71637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11051 1726773091.71647: variable 'omit' from source: magic vars 11051 1726773091.71725: variable 'omit' from source: magic vars 11051 1726773091.71756: variable 'omit' from source: magic vars 11051 1726773091.71779: variable 'omit' from source: magic vars 11051 1726773091.71816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11051 1726773091.71842: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11051 1726773091.71862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11051 1726773091.71876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11051 1726773091.71890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11051 1726773091.71917: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11051 1726773091.71923: variable 'ansible_host' from source: host vars for 'managed_node2' 11051 1726773091.71927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11051 1726773091.71998: Set connection var ansible_pipelining to False 11051 1726773091.72008: Set connection var ansible_timeout to 10 11051 1726773091.72015: Set connection var ansible_module_compression to ZIP_DEFLATED 11051 1726773091.72019: Set connection var ansible_shell_type to sh 11051 1726773091.72024: Set connection var ansible_shell_executable to /bin/sh 11051 1726773091.72029: Set connection var ansible_connection to ssh 11051 1726773091.72045: variable 'ansible_shell_executable' from source: unknown 11051 1726773091.72049: variable 'ansible_connection' from source: unknown 11051 1726773091.72053: variable 'ansible_module_compression' from source: unknown 11051 1726773091.72056: variable 'ansible_shell_type' from source: unknown 11051 1726773091.72060: variable 'ansible_shell_executable' from source: unknown 11051 1726773091.72063: variable 'ansible_host' from source: host vars for 'managed_node2' 11051 1726773091.72067: variable 'ansible_pipelining' from source: unknown 11051 1726773091.72071: variable 'ansible_timeout' from source: unknown 11051 1726773091.72076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11051 1726773091.72173: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11051 1726773091.72187: variable 'omit' from source: magic vars 11051 1726773091.72193: starting attempt loop 11051 1726773091.72197: running the handler 11051 1726773091.72209: handler 
run complete 11051 1726773091.72219: attempt loop complete, returning result 11051 1726773091.72222: _execute() done 11051 1726773091.72225: dumping result to json 11051 1726773091.72229: done dumping result, returning 11051 1726773091.72235: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-000000000623] 11051 1726773091.72240: sending task result for task 0affffe7-6841-885f-bbcf-000000000623 11051 1726773091.72264: done sending task result for task 0affffe7-6841-885f-bbcf-000000000623 11051 1726773091.72267: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8240 1726773091.72408: no more pending results, returning what we have 8240 1726773091.72411: results queue empty 8240 1726773091.72412: checking for any_errors_fatal 8240 1726773091.72418: done checking for any_errors_fatal 8240 1726773091.72419: checking for max_fail_percentage 8240 1726773091.72420: done checking for max_fail_percentage 8240 1726773091.72421: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.72422: done checking to see if all hosts have failed 8240 1726773091.72422: getting the remaining hosts for this loop 8240 1726773091.72424: done getting the remaining hosts for this loop 8240 1726773091.72426: getting the next task for host managed_node2 8240 1726773091.72432: done getting next task for host managed_node2 8240 1726773091.72436: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8240 1726773091.72438: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773091.72447: getting variables 8240 1726773091.72448: in VariableManager get_vars() 8240 1726773091.72475: Calling all_inventory to load vars for managed_node2 8240 1726773091.72476: Calling groups_inventory to load vars for managed_node2 8240 1726773091.72478: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.72486: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.72488: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.72490: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.72597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.72751: done with get_vars() 8240 1726773091.72759: done getting variables 8240 1726773091.72808: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.017) 0:01:10.372 **** 8240 1726773091.72838: entering _queue_task() for managed_node2/set_fact 8240 1726773091.73026: worker is 1 (out of 1 available) 8240 1726773091.73040: exiting _queue_task() for managed_node2/set_fact 8240 1726773091.73052: done queuing things up, now waiting for results queue to drain 8240 1726773091.73053: waiting for pending results... 11052 1726773091.73275: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 11052 1726773091.73415: in run() - task 0affffe7-6841-885f-bbcf-000000000624 11052 1726773091.73432: variable 'ansible_search_path' from source: unknown 11052 1726773091.73436: variable 'ansible_search_path' from source: unknown 11052 1726773091.73467: calling self._execute() 11052 1726773091.73559: variable 'ansible_host' from source: host vars for 'managed_node2' 11052 1726773091.73568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11052 1726773091.73577: variable 'omit' from source: magic vars 11052 1726773091.73679: variable 'omit' from source: magic vars 11052 1726773091.73731: variable 'omit' from source: magic vars 11052 1726773091.74102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11052 1726773091.74360: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11052 1726773091.74399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11052 1726773091.74427: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11052 1726773091.74453: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11052 1726773091.74560: variable '__kernel_settings_register_profile' from source: set_fact 11052 1726773091.74574: variable '__kernel_settings_register_mode' from source: set_fact 11052 1726773091.74582: variable '__kernel_settings_register_apply' from source: set_fact 11052 1726773091.74625: variable 'omit' from source: magic vars 11052 
1726773091.74647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11052 1726773091.74669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11052 1726773091.74686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11052 1726773091.74701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11052 1726773091.74711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11052 1726773091.74736: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11052 1726773091.74741: variable 'ansible_host' from source: host vars for 'managed_node2' 11052 1726773091.74745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11052 1726773091.74815: Set connection var ansible_pipelining to False 11052 1726773091.74823: Set connection var ansible_timeout to 10 11052 1726773091.74831: Set connection var ansible_module_compression to ZIP_DEFLATED 11052 1726773091.74835: Set connection var ansible_shell_type to sh 11052 1726773091.74839: Set connection var ansible_shell_executable to /bin/sh 11052 1726773091.74842: Set connection var ansible_connection to ssh 11052 1726773091.74855: variable 'ansible_shell_executable' from source: unknown 11052 1726773091.74858: variable 'ansible_connection' from source: unknown 11052 1726773091.74859: variable 'ansible_module_compression' from source: unknown 11052 1726773091.74861: variable 'ansible_shell_type' from source: unknown 11052 1726773091.74863: variable 'ansible_shell_executable' from source: unknown 11052 1726773091.74865: variable 'ansible_host' from source: host vars for 'managed_node2' 11052 1726773091.74867: variable 'ansible_pipelining' from source: unknown 11052 1726773091.74868: variable 'ansible_timeout' from source: unknown 11052 1726773091.74870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11052 1726773091.74937: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11052 1726773091.74948: variable 'omit' from source: magic vars 11052 1726773091.74952: starting attempt loop 11052 1726773091.74954: running the handler 11052 1726773091.74961: handler run complete 11052 1726773091.74968: attempt loop complete, returning result 11052 1726773091.74970: _execute() done 11052 1726773091.74971: dumping result to json 11052 1726773091.74973: done dumping result, returning 11052 1726773091.74977: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-885f-bbcf-000000000624] 11052 1726773091.74981: sending task result for task 0affffe7-6841-885f-bbcf-000000000624 11052 1726773091.75000: done sending task result for task 0affffe7-6841-885f-bbcf-000000000624 11052 1726773091.75002: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8240 1726773091.75341: no more pending results, returning what we have 8240 1726773091.75343: results queue empty 8240 1726773091.75344: checking 
for any_errors_fatal 8240 1726773091.75349: done checking for any_errors_fatal 8240 1726773091.75349: checking for max_fail_percentage 8240 1726773091.75351: done checking for max_fail_percentage 8240 1726773091.75351: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.75352: done checking to see if all hosts have failed 8240 1726773091.75352: getting the remaining hosts for this loop 8240 1726773091.75353: done getting the remaining hosts for this loop 8240 1726773091.75356: getting the next task for host managed_node2 8240 1726773091.75362: done getting next task for host managed_node2 8240 1726773091.75363: ^ task is: TASK: meta (role_complete) 8240 1726773091.75365: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773091.75371: getting variables 8240 1726773091.75372: in VariableManager get_vars() 8240 1726773091.75406: Calling all_inventory to load vars for managed_node2 8240 1726773091.75408: Calling groups_inventory to load vars for managed_node2 8240 1726773091.75409: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.75417: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.75419: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.75420: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.75535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.75698: done with get_vars() 8240 1726773091.75708: done getting variables 8240 1726773091.75761: done queuing things up, now waiting for results queue to drain 8240 1726773091.75763: results queue empty 8240 1726773091.75763: checking for any_errors_fatal 8240 1726773091.75766: done checking for any_errors_fatal 8240 1726773091.75766: checking for max_fail_percentage 8240 1726773091.75767: done checking for max_fail_percentage 8240 1726773091.75771: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.75772: done checking to see if all hosts have failed 8240 1726773091.75772: getting the remaining hosts for this loop 8240 1726773091.75772: done getting the remaining hosts for this loop 8240 1726773091.75774: getting the next task for host managed_node2 8240 1726773091.75776: done getting next task for host managed_node2 8240 1726773091.75778: ^ task is: TASK: meta (flush_handlers) 8240 1726773091.75778: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773091.75781: getting variables 8240 1726773091.75782: in VariableManager get_vars() 8240 1726773091.75792: Calling all_inventory to load vars for managed_node2 8240 1726773091.75793: Calling groups_inventory to load vars for managed_node2 8240 1726773091.75794: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.75797: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.75798: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.75800: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.75877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.75986: done with get_vars() 8240 1726773091.75992: done getting variables TASK [Force handlers] ********************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:159 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.031) 0:01:10.404 **** 8240 1726773091.76037: in VariableManager get_vars() 8240 1726773091.76045: Calling all_inventory to load vars for managed_node2 8240 1726773091.76046: Calling groups_inventory to load vars for managed_node2 8240 1726773091.76047: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.76050: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.76051: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.76052: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.76151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.76255: done with get_vars() META: triggered running handlers for managed_node2 8240 1726773091.76265: done queuing things up, now waiting for results queue to drain 8240 1726773091.76266: results queue empty 8240 1726773091.76267: checking for any_errors_fatal 8240 1726773091.76268: done checking for any_errors_fatal 8240 1726773091.76268: checking for max_fail_percentage 8240 1726773091.76269: done checking for max_fail_percentage 8240 1726773091.76269: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.76269: done checking to see if all hosts have failed 8240 1726773091.76270: getting the remaining hosts for this loop 8240 1726773091.76270: done getting the remaining hosts for this loop 8240 1726773091.76272: getting the next task for host managed_node2 8240 1726773091.76274: done getting next task for host managed_node2 8240 1726773091.76275: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8240 1726773091.76276: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773091.76277: getting variables 8240 1726773091.76278: in VariableManager get_vars() 8240 1726773091.76284: Calling all_inventory to load vars for managed_node2 8240 1726773091.76287: Calling groups_inventory to load vars for managed_node2 8240 1726773091.76288: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.76291: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.76292: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.76294: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.76415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.76541: done with get_vars() 8240 1726773091.76548: done getting variables 8240 1726773091.76581: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings_reboot_required is not set or is false] *********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:162 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.005) 0:01:10.409 **** 8240 1726773091.76603: entering _queue_task() for managed_node2/assert 8240 1726773091.76831: worker is 1 (out of 1 available) 8240 1726773091.76847: exiting _queue_task() for managed_node2/assert 8240 1726773091.76859: done queuing things up, now waiting for results queue to drain 8240 1726773091.76861: waiting for pending results... 11055 1726773091.76999: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 11055 1726773091.77115: in run() - task 0affffe7-6841-885f-bbcf-000000000021 11055 1726773091.77136: variable 'ansible_search_path' from source: unknown 11055 1726773091.77168: calling self._execute() 11055 1726773091.77262: variable 'ansible_host' from source: host vars for 'managed_node2' 11055 1726773091.77271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11055 1726773091.77279: variable 'omit' from source: magic vars 11055 1726773091.77374: variable 'omit' from source: magic vars 11055 1726773091.77406: variable 'omit' from source: magic vars 11055 1726773091.77437: variable 'omit' from source: magic vars 11055 1726773091.77476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11055 1726773091.77514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11055 1726773091.77536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11055 1726773091.77553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11055 1726773091.77568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11055 1726773091.77597: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11055 1726773091.77604: variable 'ansible_host' from source: host vars for 'managed_node2' 11055 1726773091.77608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11055 1726773091.77692: Set connection var 
ansible_pipelining to False 11055 1726773091.77698: Set connection var ansible_timeout to 10 11055 1726773091.77704: Set connection var ansible_module_compression to ZIP_DEFLATED 11055 1726773091.77706: Set connection var ansible_shell_type to sh 11055 1726773091.77709: Set connection var ansible_shell_executable to /bin/sh 11055 1726773091.77715: Set connection var ansible_connection to ssh 11055 1726773091.77737: variable 'ansible_shell_executable' from source: unknown 11055 1726773091.77741: variable 'ansible_connection' from source: unknown 11055 1726773091.77743: variable 'ansible_module_compression' from source: unknown 11055 1726773091.77746: variable 'ansible_shell_type' from source: unknown 11055 1726773091.77748: variable 'ansible_shell_executable' from source: unknown 11055 1726773091.77749: variable 'ansible_host' from source: host vars for 'managed_node2' 11055 1726773091.77751: variable 'ansible_pipelining' from source: unknown 11055 1726773091.77754: variable 'ansible_timeout' from source: unknown 11055 1726773091.77757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11055 1726773091.77926: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11055 1726773091.77939: variable 'omit' from source: magic vars 11055 1726773091.77944: starting attempt loop 11055 1726773091.77947: running the handler 11055 1726773091.78218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11055 1726773091.80108: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11055 1726773091.80155: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11055 1726773091.80184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11055 1726773091.80215: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11055 1726773091.80237: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11055 1726773091.80288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11055 1726773091.80310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11055 1726773091.80328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11055 1726773091.80358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11055 1726773091.80369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11055 
1726773091.80456: variable 'kernel_settings_reboot_required' from source: set_fact 11055 1726773091.80472: Evaluated conditional (not kernel_settings_reboot_required | d(false)): True 11055 1726773091.80479: handler run complete 11055 1726773091.80497: attempt loop complete, returning result 11055 1726773091.80501: _execute() done 11055 1726773091.80504: dumping result to json 11055 1726773091.80508: done dumping result, returning 11055 1726773091.80514: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [0affffe7-6841-885f-bbcf-000000000021] 11055 1726773091.80520: sending task result for task 0affffe7-6841-885f-bbcf-000000000021 11055 1726773091.80543: done sending task result for task 0affffe7-6841-885f-bbcf-000000000021 11055 1726773091.80546: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8240 1726773091.80660: no more pending results, returning what we have 8240 1726773091.80664: results queue empty 8240 1726773091.80665: checking for any_errors_fatal 8240 1726773091.80667: done checking for any_errors_fatal 8240 1726773091.80667: checking for max_fail_percentage 8240 1726773091.80669: done checking for max_fail_percentage 8240 1726773091.80670: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.80671: done checking to see if all hosts have failed 8240 1726773091.80671: getting the remaining hosts for this loop 8240 1726773091.80673: done getting the remaining hosts for this loop 8240 1726773091.80676: getting the next task for host managed_node2 8240 1726773091.80683: done getting next task for host managed_node2 8240 1726773091.80684: ^ task is: TASK: Ensure role reported changed 8240 1726773091.80688: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773091.80691: getting variables 8240 1726773091.80693: in VariableManager get_vars() 8240 1726773091.80731: Calling all_inventory to load vars for managed_node2 8240 1726773091.80733: Calling groups_inventory to load vars for managed_node2 8240 1726773091.80735: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.80746: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.80754: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.80757: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.80883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.81004: done with get_vars() 8240 1726773091.81012: done getting variables 8240 1726773091.81055: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:166 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.044) 0:01:10.454 **** 8240 1726773091.81076: entering _queue_task() for managed_node2/assert 8240 1726773091.81246: worker is 1 (out of 1 available) 8240 1726773091.81260: exiting _queue_task() for managed_node2/assert 8240 1726773091.81274: done queuing things up, now waiting for results queue to drain 8240 1726773091.81276: waiting for pending results... 11057 1726773091.81404: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 11057 1726773091.81519: in run() - task 0affffe7-6841-885f-bbcf-000000000022 11057 1726773091.81537: variable 'ansible_search_path' from source: unknown 11057 1726773091.81567: calling self._execute() 11057 1726773091.81656: variable 'ansible_host' from source: host vars for 'managed_node2' 11057 1726773091.81759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11057 1726773091.81770: variable 'omit' from source: magic vars 11057 1726773091.81872: variable 'omit' from source: magic vars 11057 1726773091.81905: variable 'omit' from source: magic vars 11057 1726773091.81936: variable 'omit' from source: magic vars 11057 1726773091.81974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11057 1726773091.82008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11057 1726773091.82029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11057 1726773091.82045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11057 1726773091.82057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11057 1726773091.82087: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11057 1726773091.82093: variable 'ansible_host' from source: host vars for 'managed_node2' 11057 1726773091.82097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11057 1726773091.82200: Set connection var ansible_pipelining to False 11057 
1726773091.82208: Set connection var ansible_timeout to 10 11057 1726773091.82216: Set connection var ansible_module_compression to ZIP_DEFLATED 11057 1726773091.82220: Set connection var ansible_shell_type to sh 11057 1726773091.82225: Set connection var ansible_shell_executable to /bin/sh 11057 1726773091.82230: Set connection var ansible_connection to ssh 11057 1726773091.82253: variable 'ansible_shell_executable' from source: unknown 11057 1726773091.82257: variable 'ansible_connection' from source: unknown 11057 1726773091.82259: variable 'ansible_module_compression' from source: unknown 11057 1726773091.82261: variable 'ansible_shell_type' from source: unknown 11057 1726773091.82263: variable 'ansible_shell_executable' from source: unknown 11057 1726773091.82265: variable 'ansible_host' from source: host vars for 'managed_node2' 11057 1726773091.82267: variable 'ansible_pipelining' from source: unknown 11057 1726773091.82268: variable 'ansible_timeout' from source: unknown 11057 1726773091.82270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11057 1726773091.82407: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11057 1726773091.82420: variable 'omit' from source: magic vars 11057 1726773091.82426: starting attempt loop 11057 1726773091.82429: running the handler 11057 1726773091.82674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11057 1726773091.84210: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11057 1726773091.84262: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11057 1726773091.84291: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11057 1726773091.84321: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11057 1726773091.84340: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11057 1726773091.84391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11057 1726773091.84413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11057 1726773091.84434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11057 1726773091.84461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11057 1726773091.84472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11057 1726773091.84549: variable 
'__kernel_settings_changed' from source: set_fact 11057 1726773091.84564: Evaluated conditional (__kernel_settings_changed | d(false)): True 11057 1726773091.84571: handler run complete 11057 1726773091.84589: attempt loop complete, returning result 11057 1726773091.84593: _execute() done 11057 1726773091.84596: dumping result to json 11057 1726773091.84600: done dumping result, returning 11057 1726773091.84606: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [0affffe7-6841-885f-bbcf-000000000022] 11057 1726773091.84612: sending task result for task 0affffe7-6841-885f-bbcf-000000000022 11057 1726773091.84633: done sending task result for task 0affffe7-6841-885f-bbcf-000000000022 11057 1726773091.84636: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8240 1726773091.84743: no more pending results, returning what we have 8240 1726773091.84747: results queue empty 8240 1726773091.84747: checking for any_errors_fatal 8240 1726773091.84754: done checking for any_errors_fatal 8240 1726773091.84755: checking for max_fail_percentage 8240 1726773091.84757: done checking for max_fail_percentage 8240 1726773091.84757: checking to see if all hosts have failed and the running result is not ok 8240 1726773091.84758: done checking to see if all hosts have failed 8240 1726773091.84759: getting the remaining hosts for this loop 8240 1726773091.84760: done getting the remaining hosts for this loop 8240 1726773091.84763: getting the next task for host managed_node2 8240 1726773091.84769: done getting next task for host managed_node2 8240 1726773091.84772: ^ task is: TASK: Check sysctl 8240 1726773091.84774: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773091.84776: getting variables 8240 1726773091.84777: in VariableManager get_vars() 8240 1726773091.84877: Calling all_inventory to load vars for managed_node2 8240 1726773091.84880: Calling groups_inventory to load vars for managed_node2 8240 1726773091.84881: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773091.84892: Calling all_plugins_play to load vars for managed_node2 8240 1726773091.84895: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773091.84904: Calling groups_plugins_play to load vars for managed_node2 8240 1726773091.85005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773091.85113: done with get_vars() 8240 1726773091.85121: done getting variables 8240 1726773091.85160: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysctl] ************************************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:170 Thursday 19 September 2024 15:11:31 -0400 (0:00:00.041) 0:01:10.495 **** 8240 1726773091.85182: entering _queue_task() for managed_node2/shell 8240 1726773091.85346: worker is 1 (out of 1 available) 8240 1726773091.85360: exiting _queue_task() for managed_node2/shell 8240 1726773091.85374: done queuing things up, now waiting for results queue to drain 8240 1726773091.85376: waiting for pending results... 11059 1726773091.85499: running TaskExecutor() for managed_node2/TASK: Check sysctl 11059 1726773091.85600: in run() - task 0affffe7-6841-885f-bbcf-000000000023 11059 1726773091.85617: variable 'ansible_search_path' from source: unknown 11059 1726773091.85644: calling self._execute() 11059 1726773091.85712: variable 'ansible_host' from source: host vars for 'managed_node2' 11059 1726773091.85720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11059 1726773091.85726: variable 'omit' from source: magic vars 11059 1726773091.85805: variable 'omit' from source: magic vars 11059 1726773091.85831: variable 'omit' from source: magic vars 11059 1726773091.85853: variable 'omit' from source: magic vars 11059 1726773091.85886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11059 1726773091.85914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11059 1726773091.85935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11059 1726773091.85951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11059 1726773091.85962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11059 1726773091.85988: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11059 1726773091.85994: variable 'ansible_host' from source: host vars for 'managed_node2' 11059 1726773091.85999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11059 1726773091.86070: Set connection var ansible_pipelining to False 11059 1726773091.86077: Set 
connection var ansible_timeout to 10 11059 1726773091.86086: Set connection var ansible_module_compression to ZIP_DEFLATED 11059 1726773091.86089: Set connection var ansible_shell_type to sh 11059 1726773091.86095: Set connection var ansible_shell_executable to /bin/sh 11059 1726773091.86100: Set connection var ansible_connection to ssh 11059 1726773091.86116: variable 'ansible_shell_executable' from source: unknown 11059 1726773091.86121: variable 'ansible_connection' from source: unknown 11059 1726773091.86124: variable 'ansible_module_compression' from source: unknown 11059 1726773091.86128: variable 'ansible_shell_type' from source: unknown 11059 1726773091.86131: variable 'ansible_shell_executable' from source: unknown 11059 1726773091.86135: variable 'ansible_host' from source: host vars for 'managed_node2' 11059 1726773091.86139: variable 'ansible_pipelining' from source: unknown 11059 1726773091.86142: variable 'ansible_timeout' from source: unknown 11059 1726773091.86146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11059 1726773091.86237: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11059 1726773091.86248: variable 'omit' from source: magic vars 11059 1726773091.86253: starting attempt loop 11059 1726773091.86255: running the handler 11059 1726773091.86263: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11059 1726773091.86277: _low_level_execute_command(): starting 11059 1726773091.86283: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11059 1726773091.88634: stdout chunk (state=2): >>>/root <<< 11059 1726773091.88756: stderr chunk (state=3): >>><<< 11059 1726773091.88763: stdout chunk (state=3): >>><<< 11059 1726773091.88787: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11059 1726773091.88802: _low_level_execute_command(): starting 11059 1726773091.88810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636 `" && echo ansible-tmp-1726773091.8879633-11059-227794821608636="` echo /root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636 `" ) && sleep 0' 11059 1726773091.91402: stdout chunk (state=2): >>>ansible-tmp-1726773091.8879633-11059-227794821608636=/root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636 <<< 11059 1726773091.91536: stderr chunk (state=3): >>><<< 11059 1726773091.91545: stdout chunk (state=3): >>><<< 11059 1726773091.91562: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773091.8879633-11059-227794821608636=/root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636 , stderr= 11059 1726773091.91592: variable 'ansible_module_compression' from source: unknown 11059 1726773091.91644: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11059 1726773091.91678: variable 'ansible_facts' from source: unknown 11059 
1726773091.91757: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636/AnsiballZ_command.py 11059 1726773091.91869: Sending initial data 11059 1726773091.91876: Sent initial data (155 bytes) 11059 1726773091.94421: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpn0o1u8lt /root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636/AnsiballZ_command.py <<< 11059 1726773091.95523: stderr chunk (state=3): >>><<< 11059 1726773091.95530: stdout chunk (state=3): >>><<< 11059 1726773091.95549: done transferring module to remote 11059 1726773091.95560: _low_level_execute_command(): starting 11059 1726773091.95565: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636/ /root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636/AnsiballZ_command.py && sleep 0' 11059 1726773091.97921: stderr chunk (state=2): >>><<< 11059 1726773091.97929: stdout chunk (state=2): >>><<< 11059 1726773091.97942: _low_level_execute_command() done: rc=0, stdout=, stderr= 11059 1726773091.97946: _low_level_execute_command(): starting 11059 1726773091.97951: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636/AnsiballZ_command.py && sleep 0' 11059 1726773092.13564: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "start": "2024-09-19 15:11:32.127344", "end": "2024-09-19 15:11:32.133707", "delta": "0:00:00.006363", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11059 1726773092.14720: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11059 1726773092.14760: stderr chunk (state=3): >>><<< 11059 1726773092.14767: stdout chunk (state=3): >>><<< 11059 1726773092.14787: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "start": "2024-09-19 15:11:32.127344", "end": "2024-09-19 15:11:32.133707", "delta": "0:00:00.006363", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
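The "Check sysctl" step here runs through the shell action plugin, which wraps the command module (AnsiballZ_command.py) and executes the pipeline shown in the module_args. Reconstructed from that logged invocation, the test task at tests_change_settings.yml:170 is roughly equivalent to the sketch below; the exact task name, changed_when handling, and any surrounding block are assumptions, not the verbatim file contents:

    # Sketch reconstructed from the logged module_args; not the verbatim task.
    - name: Check sysctl
      ansible.builtin.shell: |
        set -euo pipefail
        sysctl -n fs.file-max | grep -qx 400001
      changed_when: false  # assumed from the final "changed": false in the task result

grep -qx succeeds only when a whole line of the sysctl output matches 400001 exactly, so with set -e and pipefail a non-matching value makes the shell exit non-zero and the task fail, which is how the test verifies the role applied fs.file-max = 400001.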
11059 1726773092.14831: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11059 1726773092.14843: _low_level_execute_command(): starting 11059 1726773092.14849: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773091.8879633-11059-227794821608636/ > /dev/null 2>&1 && sleep 0' 11059 1726773092.17222: stderr chunk (state=2): >>><<< 11059 1726773092.17230: stdout chunk (state=2): >>><<< 11059 1726773092.17244: _low_level_execute_command() done: rc=0, stdout=, stderr= 11059 1726773092.17254: handler run complete 11059 1726773092.17273: Evaluated conditional (False): False 11059 1726773092.17283: attempt loop complete, returning result 11059 1726773092.17287: _execute() done 11059 1726773092.17291: dumping result to json 11059 1726773092.17296: done dumping result, returning 11059 1726773092.17304: done running TaskExecutor() for managed_node2/TASK: Check sysctl [0affffe7-6841-885f-bbcf-000000000023] 11059 1726773092.17310: sending task result for task 0affffe7-6841-885f-bbcf-000000000023 11059 1726773092.17340: done sending task result for task 0affffe7-6841-885f-bbcf-000000000023 11059 1726773092.17343: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -qx 400001", "delta": "0:00:00.006363", "end": "2024-09-19 15:11:32.133707", "rc": 0, "start": "2024-09-19 15:11:32.127344" } 8240 1726773092.17478: no more pending results, returning what we have 8240 1726773092.17481: results queue empty 8240 1726773092.17482: checking for any_errors_fatal 8240 1726773092.17488: done checking for any_errors_fatal 8240 1726773092.17489: checking for max_fail_percentage 8240 1726773092.17490: done checking for max_fail_percentage 8240 1726773092.17491: checking to see if all hosts have failed and the running result is not ok 8240 1726773092.17492: done checking to see if all hosts have failed 8240 1726773092.17493: getting the remaining hosts for this loop 8240 1726773092.17494: done getting the remaining hosts for this loop 8240 1726773092.17497: getting the next task for host managed_node2 8240 1726773092.17506: done getting next task for host managed_node2 8240 1726773092.17508: ^ task is: TASK: Check sysfs after role runs 8240 1726773092.17509: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773092.17512: getting variables 8240 1726773092.17514: in VariableManager get_vars() 8240 1726773092.17549: Calling all_inventory to load vars for managed_node2 8240 1726773092.17552: Calling groups_inventory to load vars for managed_node2 8240 1726773092.17553: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.17563: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.17566: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.17568: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.17688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.17832: done with get_vars() 8240 1726773092.17839: done getting variables 8240 1726773092.17881: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:176 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.327) 0:01:10.822 **** 8240 1726773092.17907: entering _queue_task() for managed_node2/command 8240 1726773092.18072: worker is 1 (out of 1 available) 8240 1726773092.18089: exiting _queue_task() for managed_node2/command 8240 1726773092.18106: done queuing things up, now waiting for results queue to drain 8240 1726773092.18108: waiting for pending results... 11078 1726773092.18226: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 11078 1726773092.18330: in run() - task 0affffe7-6841-885f-bbcf-000000000024 11078 1726773092.18345: variable 'ansible_search_path' from source: unknown 11078 1726773092.18376: calling self._execute() 11078 1726773092.18450: variable 'ansible_host' from source: host vars for 'managed_node2' 11078 1726773092.18458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11078 1726773092.18464: variable 'omit' from source: magic vars 11078 1726773092.18545: variable 'omit' from source: magic vars 11078 1726773092.18572: variable 'omit' from source: magic vars 11078 1726773092.18596: variable 'omit' from source: magic vars 11078 1726773092.18630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11078 1726773092.18658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11078 1726773092.18678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11078 1726773092.18695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11078 1726773092.18706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11078 1726773092.18730: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11078 1726773092.18733: variable 'ansible_host' from source: host vars for 'managed_node2' 11078 1726773092.18736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11078 1726773092.18812: Set connection var ansible_pipelining to False 11078 
1726773092.18819: Set connection var ansible_timeout to 10 11078 1726773092.18827: Set connection var ansible_module_compression to ZIP_DEFLATED 11078 1726773092.18830: Set connection var ansible_shell_type to sh 11078 1726773092.18835: Set connection var ansible_shell_executable to /bin/sh 11078 1726773092.18840: Set connection var ansible_connection to ssh 11078 1726773092.18854: variable 'ansible_shell_executable' from source: unknown 11078 1726773092.18859: variable 'ansible_connection' from source: unknown 11078 1726773092.18863: variable 'ansible_module_compression' from source: unknown 11078 1726773092.18866: variable 'ansible_shell_type' from source: unknown 11078 1726773092.18870: variable 'ansible_shell_executable' from source: unknown 11078 1726773092.18874: variable 'ansible_host' from source: host vars for 'managed_node2' 11078 1726773092.18877: variable 'ansible_pipelining' from source: unknown 11078 1726773092.18879: variable 'ansible_timeout' from source: unknown 11078 1726773092.18881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11078 1726773092.18972: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11078 1726773092.18984: variable 'omit' from source: magic vars 11078 1726773092.18992: starting attempt loop 11078 1726773092.18995: running the handler 11078 1726773092.19010: _low_level_execute_command(): starting 11078 1726773092.19018: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11078 1726773092.21345: stdout chunk (state=2): >>>/root <<< 11078 1726773092.21466: stderr chunk (state=3): >>><<< 11078 1726773092.21473: stdout chunk (state=3): >>><<< 11078 1726773092.21493: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11078 1726773092.21507: _low_level_execute_command(): starting 11078 1726773092.21513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403 `" && echo ansible-tmp-1726773092.2150085-11078-69487649351403="` echo /root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403 `" ) && sleep 0' 11078 1726773092.24122: stdout chunk (state=2): >>>ansible-tmp-1726773092.2150085-11078-69487649351403=/root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403 <<< 11078 1726773092.24250: stderr chunk (state=3): >>><<< 11078 1726773092.24256: stdout chunk (state=3): >>><<< 11078 1726773092.24270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773092.2150085-11078-69487649351403=/root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403 , stderr= 11078 1726773092.24296: variable 'ansible_module_compression' from source: unknown 11078 1726773092.24344: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11078 1726773092.24376: variable 'ansible_facts' from source: unknown 11078 1726773092.24450: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403/AnsiballZ_command.py 11078 1726773092.24550: Sending initial data 11078 1726773092.24557: Sent initial data (154 bytes) 11078 1726773092.27065: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-8240kvoq26km/tmpu4c9ar7h /root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403/AnsiballZ_command.py <<< 11078 1726773092.28163: stderr chunk (state=3): >>><<< 11078 1726773092.28170: stdout chunk (state=3): >>><<< 11078 1726773092.28192: done transferring module to remote 11078 1726773092.28203: _low_level_execute_command(): starting 11078 1726773092.28209: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403/ /root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403/AnsiballZ_command.py && sleep 0' 11078 1726773092.30556: stderr chunk (state=2): >>><<< 11078 1726773092.30563: stdout chunk (state=2): >>><<< 11078 1726773092.30577: _low_level_execute_command() done: rc=0, stdout=, stderr= 11078 1726773092.30581: _low_level_execute_command(): starting 11078 1726773092.30587: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403/AnsiballZ_command.py && sleep 0' 11078 1726773092.46070: stdout chunk (state=2): >>> {"changed": true, "stdout": "60666", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "60666", "/sys/class/net/lo/mtu"], "start": "2024-09-19 15:11:32.455395", "end": "2024-09-19 15:11:32.458779", "delta": "0:00:00.003384", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 60666 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11078 1726773092.47213: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11078 1726773092.47254: stderr chunk (state=3): >>><<< 11078 1726773092.47261: stdout chunk (state=3): >>><<< 11078 1726773092.47278: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60666", "stderr": "", "rc": 0, "cmd": ["grep", "-x", "60666", "/sys/class/net/lo/mtu"], "start": "2024-09-19 15:11:32.455395", "end": "2024-09-19 15:11:32.458779", "delta": "0:00:00.003384", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -x 60666 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
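The "Check sysfs after role runs" step uses the bare command module (no shell), as the argv form of cmd in the result shows. An assumed minimal equivalent of the task at tests_change_settings.yml:176 would be:

    # Sketch reconstructed from the logged cmd; not the verbatim task.
    - name: Check sysfs after role runs
      ansible.builtin.command: grep -x 60666 /sys/class/net/lo/mtu
      changed_when: false  # assumed from the final "changed": false in the task result

Here grep -x checks that /sys/class/net/lo/mtu contains exactly 60666, i.e. that the sysfs value written by the role is still in effect after the role run.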
11078 1726773092.47321: done with _execute_module (ansible.legacy.command, {'_raw_params': 'grep -x 60666 /sys/class/net/lo/mtu', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11078 1726773092.47332: _low_level_execute_command(): starting 11078 1726773092.47338: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773092.2150085-11078-69487649351403/ > /dev/null 2>&1 && sleep 0' 11078 1726773092.49726: stderr chunk (state=2): >>><<< 11078 1726773092.49735: stdout chunk (state=2): >>><<< 11078 1726773092.49751: _low_level_execute_command() done: rc=0, stdout=, stderr= 11078 1726773092.49758: handler run complete 11078 1726773092.49776: Evaluated conditional (False): False 11078 1726773092.49787: attempt loop complete, returning result 11078 1726773092.49791: _execute() done 11078 1726773092.49794: dumping result to json 11078 1726773092.49800: done dumping result, returning 11078 1726773092.49808: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [0affffe7-6841-885f-bbcf-000000000024] 11078 1726773092.49813: sending task result for task 0affffe7-6841-885f-bbcf-000000000024 11078 1726773092.49842: done sending task result for task 0affffe7-6841-885f-bbcf-000000000024 11078 1726773092.49845: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-x", "60666", "/sys/class/net/lo/mtu" ], "delta": "0:00:00.003384", "end": "2024-09-19 15:11:32.458779", "rc": 0, "start": "2024-09-19 15:11:32.455395" } STDOUT: 60666 8240 1726773092.49983: no more pending results, returning what we have 8240 1726773092.49987: results queue empty 8240 1726773092.49988: checking for any_errors_fatal 8240 1726773092.49997: done checking for any_errors_fatal 8240 1726773092.49997: checking for max_fail_percentage 8240 1726773092.49999: done checking for max_fail_percentage 8240 1726773092.50000: checking to see if all hosts have failed and the running result is not ok 8240 1726773092.50003: done checking to see if all hosts have failed 8240 1726773092.50003: getting the remaining hosts for this loop 8240 1726773092.50005: done getting the remaining hosts for this loop 8240 1726773092.50008: getting the next task for host managed_node2 8240 1726773092.50014: done getting next task for host managed_node2 8240 1726773092.50016: ^ task is: TASK: Apply kernel_settings for removing section 8240 1726773092.50018: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773092.50021: getting variables 8240 1726773092.50023: in VariableManager get_vars() 8240 1726773092.50056: Calling all_inventory to load vars for managed_node2 8240 1726773092.50058: Calling groups_inventory to load vars for managed_node2 8240 1726773092.50060: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.50070: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.50073: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.50075: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.50192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.50313: done with get_vars() 8240 1726773092.50321: done getting variables TASK [Apply kernel_settings for removing section] ****************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:180 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.324) 0:01:11.147 **** 8240 1726773092.50389: entering _queue_task() for managed_node2/include_role 8240 1726773092.50555: worker is 1 (out of 1 available) 8240 1726773092.50569: exiting _queue_task() for managed_node2/include_role 8240 1726773092.50582: done queuing things up, now waiting for results queue to drain 8240 1726773092.50584: waiting for pending results... 11093 1726773092.50706: running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing section 11093 1726773092.50815: in run() - task 0affffe7-6841-885f-bbcf-000000000025 11093 1726773092.50831: variable 'ansible_search_path' from source: unknown 11093 1726773092.50859: calling self._execute() 11093 1726773092.50931: variable 'ansible_host' from source: host vars for 'managed_node2' 11093 1726773092.50940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11093 1726773092.50949: variable 'omit' from source: magic vars 11093 1726773092.51021: _execute() done 11093 1726773092.51028: dumping result to json 11093 1726773092.51033: done dumping result, returning 11093 1726773092.51038: done running TaskExecutor() for managed_node2/TASK: Apply kernel_settings for removing section [0affffe7-6841-885f-bbcf-000000000025] 11093 1726773092.51045: sending task result for task 0affffe7-6841-885f-bbcf-000000000025 11093 1726773092.51075: done sending task result for task 0affffe7-6841-885f-bbcf-000000000025 11093 1726773092.51078: WORKER PROCESS EXITING 8240 1726773092.51199: no more pending results, returning what we have 8240 1726773092.51205: in VariableManager get_vars() 8240 1726773092.51238: Calling all_inventory to load vars for managed_node2 8240 1726773092.51241: Calling groups_inventory to load vars for managed_node2 8240 1726773092.51243: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.51252: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.51254: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.51256: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.51412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.51520: done with get_vars() 8240 1726773092.51525: variable 'ansible_search_path' from source: unknown 8240 1726773092.53650: variable 'omit' from source: magic vars 8240 1726773092.53666: variable 'omit' from source: magic vars 8240 1726773092.53676: 
variable 'omit' from source: magic vars 8240 1726773092.53679: we have included files to process 8240 1726773092.53680: generating all_blocks data 8240 1726773092.53681: done generating all_blocks data 8240 1726773092.53683: processing included file: fedora.linux_system_roles.kernel_settings 8240 1726773092.53699: in VariableManager get_vars() 8240 1726773092.53712: done with get_vars() 8240 1726773092.53731: in VariableManager get_vars() 8240 1726773092.53741: done with get_vars() 8240 1726773092.53768: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8240 1726773092.53809: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8240 1726773092.53828: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8240 1726773092.53872: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8240 1726773092.54358: in VariableManager get_vars() 8240 1726773092.54373: done with get_vars() 8240 1726773092.55177: in VariableManager get_vars() 8240 1726773092.55195: done with get_vars() 8240 1726773092.55296: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8240 1726773092.55680: iterating over new_blocks loaded from include file 8240 1726773092.55681: in VariableManager get_vars() 8240 1726773092.55694: done with get_vars() 8240 1726773092.55696: filtering new block on tags 8240 1726773092.55719: done filtering new block on tags 8240 1726773092.55721: in VariableManager get_vars() 8240 1726773092.55745: done with get_vars() 8240 1726773092.55747: filtering new block on tags 8240 1726773092.55769: done filtering new block on tags 8240 1726773092.55771: in VariableManager get_vars() 8240 1726773092.55780: done with get_vars() 8240 1726773092.55780: filtering new block on tags 8240 1726773092.55859: done filtering new block on tags 8240 1726773092.55861: in VariableManager get_vars() 8240 1726773092.55871: done with get_vars() 8240 1726773092.55872: filtering new block on tags 8240 1726773092.55882: done filtering new block on tags 8240 1726773092.55883: done iterating over new_blocks loaded from include file 8240 1726773092.55884: extending task lists for all hosts with included blocks 8240 1726773092.57920: done extending task lists 8240 1726773092.57921: done processing included files 8240 1726773092.57922: results queue empty 8240 1726773092.57922: checking for any_errors_fatal 8240 1726773092.57925: done checking for any_errors_fatal 8240 1726773092.57926: checking for max_fail_percentage 8240 1726773092.57926: done checking for max_fail_percentage 8240 1726773092.57927: checking to see if all hosts have failed and the running result is not ok 8240 1726773092.57927: done checking to see if all hosts have failed 8240 1726773092.57928: getting the remaining hosts for this loop 8240 1726773092.57929: done getting the remaining hosts for this loop 8240 1726773092.57931: getting the next task for host managed_node2 8240 1726773092.57934: done getting next task for host managed_node2 8240 1726773092.57936: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8240 1726773092.57937: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773092.57945: getting variables 8240 1726773092.57946: in VariableManager get_vars() 8240 1726773092.57958: Calling all_inventory to load vars for managed_node2 8240 1726773092.57960: Calling groups_inventory to load vars for managed_node2 8240 1726773092.57961: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.57964: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.57966: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.57967: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.58065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.58176: done with get_vars() 8240 1726773092.58184: done getting variables 8240 1726773092.58216: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.078) 0:01:11.226 **** 8240 1726773092.58238: entering _queue_task() for managed_node2/fail 8240 1726773092.58433: worker is 1 (out of 1 available) 8240 1726773092.58448: exiting _queue_task() for managed_node2/fail 8240 1726773092.58461: done queuing things up, now waiting for results queue to drain 8240 1726773092.58463: waiting for pending results... 
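The "Apply kernel_settings for removing section" task at tests_change_settings.yml:180 is an include_role invocation: the worker returns immediately, the role's vars/defaults/meta/tasks/handlers files are loaded, and the included blocks are appended to the host's task list. Based on the variable sources logged below (kernel_settings_sysctl comes from include params and compares equal to __kernel_settings_state_empty), one plausible and deliberately partial shape of that task is sketched here; any other parameters the test passes for the removal case are not visible in this excerpt and are omitted:

    # Assumed sketch; only the sysctl parameter is inferable from this log excerpt.
    - name: Apply kernel_settings for removing section
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.kernel_settings
      vars:
        kernel_settings_sysctl: "{{ __kernel_settings_state_empty }}"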
11094 1726773092.58595: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 11094 1726773092.58728: in run() - task 0affffe7-6841-885f-bbcf-0000000009ee 11094 1726773092.58743: variable 'ansible_search_path' from source: unknown 11094 1726773092.58748: variable 'ansible_search_path' from source: unknown 11094 1726773092.58777: calling self._execute() 11094 1726773092.58847: variable 'ansible_host' from source: host vars for 'managed_node2' 11094 1726773092.58856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11094 1726773092.58865: variable 'omit' from source: magic vars 11094 1726773092.59217: variable 'kernel_settings_sysctl' from source: include params 11094 1726773092.59232: variable '__kernel_settings_state_empty' from source: role '' all vars 11094 1726773092.59242: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): False 11094 1726773092.59246: when evaluation is False, skipping this task 11094 1726773092.59250: _execute() done 11094 1726773092.59254: dumping result to json 11094 1726773092.59257: done dumping result, returning 11094 1726773092.59263: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-885f-bbcf-0000000009ee] 11094 1726773092.59269: sending task result for task 0affffe7-6841-885f-bbcf-0000000009ee 11094 1726773092.59294: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009ee 11094 1726773092.59297: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "kernel_settings_sysctl != __kernel_settings_state_empty", "skip_reason": "Conditional result was False" } 8240 1726773092.59407: no more pending results, returning what we have 8240 1726773092.59411: results queue empty 8240 1726773092.59412: checking for any_errors_fatal 8240 1726773092.59413: done checking for any_errors_fatal 8240 1726773092.59414: checking for max_fail_percentage 8240 1726773092.59415: done checking for max_fail_percentage 8240 1726773092.59416: checking to see if all hosts have failed and the running result is not ok 8240 1726773092.59417: done checking to see if all hosts have failed 8240 1726773092.59417: getting the remaining hosts for this loop 8240 1726773092.59418: done getting the remaining hosts for this loop 8240 1726773092.59422: getting the next task for host managed_node2 8240 1726773092.59428: done getting next task for host managed_node2 8240 1726773092.59432: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8240 1726773092.59434: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773092.59451: getting variables 8240 1726773092.59452: in VariableManager get_vars() 8240 1726773092.59486: Calling all_inventory to load vars for managed_node2 8240 1726773092.59490: Calling groups_inventory to load vars for managed_node2 8240 1726773092.59492: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.59501: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.59504: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.59507: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.59617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.59753: done with get_vars() 8240 1726773092.59760: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.015) 0:01:11.242 **** 8240 1726773092.59827: entering _queue_task() for managed_node2/include_tasks 8240 1726773092.59988: worker is 1 (out of 1 available) 8240 1726773092.60003: exiting _queue_task() for managed_node2/include_tasks 8240 1726773092.60017: done queuing things up, now waiting for results queue to drain 8240 1726773092.60018: waiting for pending results... 11095 1726773092.60148: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 11095 1726773092.60257: in run() - task 0affffe7-6841-885f-bbcf-0000000009ef 11095 1726773092.60272: variable 'ansible_search_path' from source: unknown 11095 1726773092.60276: variable 'ansible_search_path' from source: unknown 11095 1726773092.60308: calling self._execute() 11095 1726773092.60374: variable 'ansible_host' from source: host vars for 'managed_node2' 11095 1726773092.60383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11095 1726773092.60394: variable 'omit' from source: magic vars 11095 1726773092.60469: _execute() done 11095 1726773092.60475: dumping result to json 11095 1726773092.60479: done dumping result, returning 11095 1726773092.60487: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-885f-bbcf-0000000009ef] 11095 1726773092.60493: sending task result for task 0affffe7-6841-885f-bbcf-0000000009ef 11095 1726773092.60520: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009ef 11095 1726773092.60523: WORKER PROCESS EXITING 8240 1726773092.60628: no more pending results, returning what we have 8240 1726773092.60632: in VariableManager get_vars() 8240 1726773092.60668: Calling all_inventory to load vars for managed_node2 8240 1726773092.60671: Calling groups_inventory to load vars for managed_node2 8240 1726773092.60672: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.60681: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.60684: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.60689: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.60795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.60910: done with get_vars() 8240 1726773092.60914: variable 'ansible_search_path' from source: unknown 8240 
1726773092.60915: variable 'ansible_search_path' from source: unknown 8240 1726773092.60939: we have included files to process 8240 1726773092.60940: generating all_blocks data 8240 1726773092.60941: done generating all_blocks data 8240 1726773092.60949: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773092.60949: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773092.60951: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8240 1726773092.61392: done processing included file 8240 1726773092.61395: iterating over new_blocks loaded from include file 8240 1726773092.61396: in VariableManager get_vars() 8240 1726773092.61411: done with get_vars() 8240 1726773092.61413: filtering new block on tags 8240 1726773092.61430: done filtering new block on tags 8240 1726773092.61450: in VariableManager get_vars() 8240 1726773092.61465: done with get_vars() 8240 1726773092.61466: filtering new block on tags 8240 1726773092.61491: done filtering new block on tags 8240 1726773092.61493: in VariableManager get_vars() 8240 1726773092.61507: done with get_vars() 8240 1726773092.61508: filtering new block on tags 8240 1726773092.61530: done filtering new block on tags 8240 1726773092.61531: in VariableManager get_vars() 8240 1726773092.61544: done with get_vars() 8240 1726773092.61545: filtering new block on tags 8240 1726773092.61558: done filtering new block on tags 8240 1726773092.61559: done iterating over new_blocks loaded from include file 8240 1726773092.61560: extending task lists for all hosts with included blocks 8240 1726773092.61653: done extending task lists 8240 1726773092.61654: done processing included files 8240 1726773092.61655: results queue empty 8240 1726773092.61655: checking for any_errors_fatal 8240 1726773092.61658: done checking for any_errors_fatal 8240 1726773092.61659: checking for max_fail_percentage 8240 1726773092.61659: done checking for max_fail_percentage 8240 1726773092.61660: checking to see if all hosts have failed and the running result is not ok 8240 1726773092.61660: done checking to see if all hosts have failed 8240 1726773092.61661: getting the remaining hosts for this loop 8240 1726773092.61661: done getting the remaining hosts for this loop 8240 1726773092.61663: getting the next task for host managed_node2 8240 1726773092.61666: done getting next task for host managed_node2 8240 1726773092.61668: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8240 1726773092.61670: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773092.61676: getting variables 8240 1726773092.61677: in VariableManager get_vars() 8240 1726773092.61688: Calling all_inventory to load vars for managed_node2 8240 1726773092.61690: Calling groups_inventory to load vars for managed_node2 8240 1726773092.61691: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.61695: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.61696: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.61697: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.61770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.61878: done with get_vars() 8240 1726773092.61887: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.021) 0:01:11.263 **** 8240 1726773092.61936: entering _queue_task() for managed_node2/setup 8240 1726773092.62092: worker is 1 (out of 1 available) 8240 1726773092.62105: exiting _queue_task() for managed_node2/setup 8240 1726773092.62117: done queuing things up, now waiting for results queue to drain 8240 1726773092.62119: waiting for pending results... 11096 1726773092.62245: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 11096 1726773092.62366: in run() - task 0affffe7-6841-885f-bbcf-000000000bd2 11096 1726773092.62383: variable 'ansible_search_path' from source: unknown 11096 1726773092.62389: variable 'ansible_search_path' from source: unknown 11096 1726773092.62416: calling self._execute() 11096 1726773092.62479: variable 'ansible_host' from source: host vars for 'managed_node2' 11096 1726773092.62541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11096 1726773092.62547: variable 'omit' from source: magic vars 11096 1726773092.62919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11096 1726773092.64582: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11096 1726773092.64664: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11096 1726773092.64715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11096 1726773092.64750: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11096 1726773092.64775: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11096 1726773092.64849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11096 1726773092.64875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11096 1726773092.64903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11096 1726773092.64942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11096 1726773092.64956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11096 1726773092.65015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11096 1726773092.65038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11096 1726773092.65062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11096 1726773092.65098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11096 1726773092.65111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11096 1726773092.65244: variable '__kernel_settings_required_facts' from source: role '' all vars 11096 1726773092.65253: variable 'ansible_facts' from source: unknown 11096 1726773092.65320: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11096 1726773092.65325: when evaluation is False, skipping this task 11096 1726773092.65328: _execute() done 11096 1726773092.65330: dumping result to json 11096 1726773092.65332: done dumping result, returning 11096 1726773092.65337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-885f-bbcf-000000000bd2] 11096 1726773092.65340: sending task result for task 0affffe7-6841-885f-bbcf-000000000bd2 11096 1726773092.65360: done sending task result for task 0affffe7-6841-885f-bbcf-000000000bd2 11096 1726773092.65361: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8240 1726773092.65628: no more pending results, returning what we have 8240 1726773092.65631: results queue empty 8240 1726773092.65632: checking for any_errors_fatal 8240 1726773092.65633: done checking for any_errors_fatal 8240 1726773092.65634: checking for max_fail_percentage 8240 1726773092.65635: done checking for max_fail_percentage 8240 1726773092.65635: checking to see if all hosts have failed and the running result is not ok 8240 1726773092.65636: done checking to see if all hosts have failed 8240 1726773092.65636: getting the remaining hosts for this loop 8240 
1726773092.65637: done getting the remaining hosts for this loop 8240 1726773092.65640: getting the next task for host managed_node2 8240 1726773092.65647: done getting next task for host managed_node2 8240 1726773092.65650: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8240 1726773092.65652: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773092.65665: getting variables 8240 1726773092.65666: in VariableManager get_vars() 8240 1726773092.65738: Calling all_inventory to load vars for managed_node2 8240 1726773092.65741: Calling groups_inventory to load vars for managed_node2 8240 1726773092.65742: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.65750: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.65751: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.65753: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.65860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.65981: done with get_vars() 8240 1726773092.65991: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.041) 0:01:11.304 **** 8240 1726773092.66059: entering _queue_task() for managed_node2/stat 8240 1726773092.66223: worker is 1 (out of 1 available) 8240 1726773092.66237: exiting _queue_task() for managed_node2/stat 8240 1726773092.66250: done queuing things up, now waiting for results queue to drain 8240 1726773092.66252: waiting for pending results... 
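After main.yml:9 includes set_vars.yml (traced above), the log shows two tasks from the top of that file: "Ensure ansible_facts used by role" (action: setup, skipped because its conditional evaluated to False) and "Check if system is ostree" (action: stat, whose executor trace follows below). A plausible reconstruction of both tasks, based only on the actions and when-conditions visible in the log, might look like the sketch below; the gather_subset value, the register name, and the /run/ostree-booted path are assumptions, since the tasks were skipped and no module arguments were logged.

# Hedged sketch of set_vars.yml:2 and set_vars.yml:10 as implied by the log.
- name: Ensure ansible_facts used by role
  setup:
    gather_subset: min   # assumption: the subset is not shown in the log (task was skipped)
  when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # assumption: the path is not logged because the task was skipped
  register: __ostree_booted_stat   # hypothetical register name
  when: not __kernel_settings_is_ostree is defined

On this run both conditionals were False, so both tasks end with "Conditional result was False", as the surrounding skip results show.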
11098 1726773092.66387: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 11098 1726773092.66528: in run() - task 0affffe7-6841-885f-bbcf-000000000bd4 11098 1726773092.66543: variable 'ansible_search_path' from source: unknown 11098 1726773092.66548: variable 'ansible_search_path' from source: unknown 11098 1726773092.66575: calling self._execute() 11098 1726773092.66642: variable 'ansible_host' from source: host vars for 'managed_node2' 11098 1726773092.66651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11098 1726773092.66660: variable 'omit' from source: magic vars 11098 1726773092.66994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11098 1726773092.67221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11098 1726773092.67304: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11098 1726773092.67337: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11098 1726773092.67366: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11098 1726773092.67435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11098 1726773092.67457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11098 1726773092.67479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11098 1726773092.67506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11098 1726773092.67614: variable '__kernel_settings_is_ostree' from source: set_fact 11098 1726773092.67627: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 11098 1726773092.67631: when evaluation is False, skipping this task 11098 1726773092.67634: _execute() done 11098 1726773092.67637: dumping result to json 11098 1726773092.67640: done dumping result, returning 11098 1726773092.67645: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-885f-bbcf-000000000bd4] 11098 1726773092.67650: sending task result for task 0affffe7-6841-885f-bbcf-000000000bd4 11098 1726773092.67677: done sending task result for task 0affffe7-6841-885f-bbcf-000000000bd4 11098 1726773092.67681: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773092.68016: no more pending results, returning what we have 8240 1726773092.68019: results queue empty 8240 1726773092.68020: checking for any_errors_fatal 8240 1726773092.68026: done checking for any_errors_fatal 8240 1726773092.68027: checking for max_fail_percentage 8240 1726773092.68028: done checking for max_fail_percentage 8240 1726773092.68029: checking to see if all hosts have failed and the 
running result is not ok 8240 1726773092.68030: done checking to see if all hosts have failed 8240 1726773092.68031: getting the remaining hosts for this loop 8240 1726773092.68032: done getting the remaining hosts for this loop 8240 1726773092.68036: getting the next task for host managed_node2 8240 1726773092.68043: done getting next task for host managed_node2 8240 1726773092.68047: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8240 1726773092.68051: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773092.68067: getting variables 8240 1726773092.68068: in VariableManager get_vars() 8240 1726773092.68106: Calling all_inventory to load vars for managed_node2 8240 1726773092.68108: Calling groups_inventory to load vars for managed_node2 8240 1726773092.68110: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.68120: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.68123: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.68125: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.68292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.68520: done with get_vars() 8240 1726773092.68532: done getting variables 8240 1726773092.68596: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.025) 0:01:11.330 **** 8240 1726773092.68624: entering _queue_task() for managed_node2/set_fact 8240 1726773092.68810: worker is 1 (out of 1 available) 8240 1726773092.68825: exiting _queue_task() for managed_node2/set_fact 8240 1726773092.68839: done queuing things up, now waiting for results queue to drain 8240 1726773092.68841: waiting for pending results... 
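The next task, "Set flag to indicate system is ostree" (set_vars.yml:15), is a set_fact that is also skipped because __kernel_settings_is_ostree is already defined. A hedged sketch of its likely shape follows; the fact value is an assumption (no arguments were logged), and __ostree_booted_stat is the hypothetical register name from the previous sketch.

# Hedged sketch of set_vars.yml:15 (action and when-condition taken from the log).
- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed value expression
  when: not __kernel_settings_is_ostree is defined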
11100 1726773092.69082: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 11100 1726773092.69264: in run() - task 0affffe7-6841-885f-bbcf-000000000bd5 11100 1726773092.69284: variable 'ansible_search_path' from source: unknown 11100 1726773092.69290: variable 'ansible_search_path' from source: unknown 11100 1726773092.69326: calling self._execute() 11100 1726773092.69424: variable 'ansible_host' from source: host vars for 'managed_node2' 11100 1726773092.69434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11100 1726773092.69445: variable 'omit' from source: magic vars 11100 1726773092.70014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11100 1726773092.70264: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11100 1726773092.70313: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11100 1726773092.70346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11100 1726773092.70381: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11100 1726773092.70460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11100 1726773092.70488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11100 1726773092.70517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11100 1726773092.70543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11100 1726773092.70653: variable '__kernel_settings_is_ostree' from source: set_fact 11100 1726773092.70666: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 11100 1726773092.70671: when evaluation is False, skipping this task 11100 1726773092.70674: _execute() done 11100 1726773092.70678: dumping result to json 11100 1726773092.70682: done dumping result, returning 11100 1726773092.70690: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-885f-bbcf-000000000bd5] 11100 1726773092.70696: sending task result for task 0affffe7-6841-885f-bbcf-000000000bd5 11100 1726773092.70728: done sending task result for task 0affffe7-6841-885f-bbcf-000000000bd5 11100 1726773092.70732: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773092.71108: no more pending results, returning what we have 8240 1726773092.71112: results queue empty 8240 1726773092.71113: checking for any_errors_fatal 8240 1726773092.71123: done checking for any_errors_fatal 8240 1726773092.71124: checking for max_fail_percentage 8240 1726773092.71125: done checking for max_fail_percentage 8240 1726773092.71126: checking to see if all 
hosts have failed and the running result is not ok 8240 1726773092.71127: done checking to see if all hosts have failed 8240 1726773092.71128: getting the remaining hosts for this loop 8240 1726773092.71130: done getting the remaining hosts for this loop 8240 1726773092.71133: getting the next task for host managed_node2 8240 1726773092.71142: done getting next task for host managed_node2 8240 1726773092.71146: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8240 1726773092.71150: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773092.71166: getting variables 8240 1726773092.71168: in VariableManager get_vars() 8240 1726773092.71213: Calling all_inventory to load vars for managed_node2 8240 1726773092.71216: Calling groups_inventory to load vars for managed_node2 8240 1726773092.71218: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.71228: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.71231: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.71234: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.71406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.71591: done with get_vars() 8240 1726773092.71602: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.030) 0:01:11.360 **** 8240 1726773092.71668: entering _queue_task() for managed_node2/stat 8240 1726773092.71837: worker is 1 (out of 1 available) 8240 1726773092.71850: exiting _queue_task() for managed_node2/stat 8240 1726773092.71863: done queuing things up, now waiting for results queue to drain 8240 1726773092.71865: waiting for pending results... 
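The trace that follows covers "Check if transactional-update exists in /sbin" (set_vars.yml:22), a stat task skipped because __kernel_settings_is_transactional is already defined. The sketch below reconstructs it from the log: the action and when-condition are taken from the trace, the path is inferred from the task name, and the register name is a hypothetical placeholder.

# Hedged sketch of set_vars.yml:22.
- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update   # path inferred from the task name
  register: __transactional_update_stat   # hypothetical register name
  when: not __kernel_settings_is_transactional is defined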
11102 1726773092.71992: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 11102 1726773092.72120: in run() - task 0affffe7-6841-885f-bbcf-000000000bd7 11102 1726773092.72136: variable 'ansible_search_path' from source: unknown 11102 1726773092.72140: variable 'ansible_search_path' from source: unknown 11102 1726773092.72165: calling self._execute() 11102 1726773092.72233: variable 'ansible_host' from source: host vars for 'managed_node2' 11102 1726773092.72241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11102 1726773092.72250: variable 'omit' from source: magic vars 11102 1726773092.72570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11102 1726773092.72803: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11102 1726773092.72836: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11102 1726773092.72863: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11102 1726773092.72891: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11102 1726773092.72947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11102 1726773092.72969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11102 1726773092.72990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11102 1726773092.73010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11102 1726773092.73094: variable '__kernel_settings_is_transactional' from source: set_fact 11102 1726773092.73105: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 11102 1726773092.73110: when evaluation is False, skipping this task 11102 1726773092.73114: _execute() done 11102 1726773092.73117: dumping result to json 11102 1726773092.73121: done dumping result, returning 11102 1726773092.73127: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-885f-bbcf-000000000bd7] 11102 1726773092.73132: sending task result for task 0affffe7-6841-885f-bbcf-000000000bd7 11102 1726773092.73154: done sending task result for task 0affffe7-6841-885f-bbcf-000000000bd7 11102 1726773092.73158: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8240 1726773092.73263: no more pending results, returning what we have 8240 1726773092.73266: results queue empty 8240 1726773092.73267: checking for any_errors_fatal 8240 1726773092.73272: done checking for any_errors_fatal 8240 1726773092.73273: checking for max_fail_percentage 8240 1726773092.73275: done checking for max_fail_percentage 8240 
1726773092.73275: checking to see if all hosts have failed and the running result is not ok 8240 1726773092.73276: done checking to see if all hosts have failed 8240 1726773092.73277: getting the remaining hosts for this loop 8240 1726773092.73278: done getting the remaining hosts for this loop 8240 1726773092.73281: getting the next task for host managed_node2 8240 1726773092.73289: done getting next task for host managed_node2 8240 1726773092.73293: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8240 1726773092.73296: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773092.73315: getting variables 8240 1726773092.73316: in VariableManager get_vars() 8240 1726773092.73350: Calling all_inventory to load vars for managed_node2 8240 1726773092.73352: Calling groups_inventory to load vars for managed_node2 8240 1726773092.73354: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.73362: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.73364: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.73365: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.73540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.73731: done with get_vars() 8240 1726773092.73741: done getting variables 8240 1726773092.73804: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.021) 0:01:11.382 **** 8240 1726773092.73839: entering _queue_task() for managed_node2/set_fact 8240 1726773092.74030: worker is 1 (out of 1 available) 8240 1726773092.74044: exiting _queue_task() for managed_node2/set_fact 8240 1726773092.74057: done queuing things up, now waiting for results queue to drain 8240 1726773092.74058: waiting for pending results... 
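The companion task, "Set flag if transactional-update exists" (set_vars.yml:27), is a set_fact guarded by the same condition and is likewise skipped below. A hedged sketch, with the value expression assumed rather than taken from the log:

# Hedged sketch of set_vars.yml:27 (action and when-condition from the log).
- name: Set flag if transactional-update exists
  set_fact:
    __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"   # assumed value expression
  when: not __kernel_settings_is_transactional is defined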
11104 1726773092.74296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 11104 1726773092.74437: in run() - task 0affffe7-6841-885f-bbcf-000000000bd8 11104 1726773092.74455: variable 'ansible_search_path' from source: unknown 11104 1726773092.74460: variable 'ansible_search_path' from source: unknown 11104 1726773092.74493: calling self._execute() 11104 1726773092.74564: variable 'ansible_host' from source: host vars for 'managed_node2' 11104 1726773092.74573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11104 1726773092.74582: variable 'omit' from source: magic vars 11104 1726773092.74914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11104 1726773092.75091: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11104 1726773092.75128: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11104 1726773092.75155: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11104 1726773092.75180: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11104 1726773092.75242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11104 1726773092.75262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11104 1726773092.75280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11104 1726773092.75300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11104 1726773092.75383: variable '__kernel_settings_is_transactional' from source: set_fact 11104 1726773092.75397: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 11104 1726773092.75404: when evaluation is False, skipping this task 11104 1726773092.75408: _execute() done 11104 1726773092.75412: dumping result to json 11104 1726773092.75416: done dumping result, returning 11104 1726773092.75422: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-885f-bbcf-000000000bd8] 11104 1726773092.75427: sending task result for task 0affffe7-6841-885f-bbcf-000000000bd8 11104 1726773092.75451: done sending task result for task 0affffe7-6841-885f-bbcf-000000000bd8 11104 1726773092.75454: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8240 1726773092.75552: no more pending results, returning what we have 8240 1726773092.75555: results queue empty 8240 1726773092.75556: checking for any_errors_fatal 8240 1726773092.75561: done checking for any_errors_fatal 8240 1726773092.75562: checking for max_fail_percentage 8240 1726773092.75563: done checking for max_fail_percentage 8240 1726773092.75564: 
checking to see if all hosts have failed and the running result is not ok 8240 1726773092.75565: done checking to see if all hosts have failed 8240 1726773092.75565: getting the remaining hosts for this loop 8240 1726773092.75567: done getting the remaining hosts for this loop 8240 1726773092.75570: getting the next task for host managed_node2 8240 1726773092.75579: done getting next task for host managed_node2 8240 1726773092.75583: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8240 1726773092.75588: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773092.75604: getting variables 8240 1726773092.75605: in VariableManager get_vars() 8240 1726773092.75636: Calling all_inventory to load vars for managed_node2 8240 1726773092.75638: Calling groups_inventory to load vars for managed_node2 8240 1726773092.75640: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.75649: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.75651: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.75654: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.75759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.75879: done with get_vars() 8240 1726773092.75889: done getting variables 8240 1726773092.75930: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.021) 0:01:11.403 **** 8240 1726773092.75955: entering _queue_task() for managed_node2/include_vars 8240 1726773092.76107: worker is 1 (out of 1 available) 8240 1726773092.76121: exiting _queue_task() for managed_node2/include_vars 8240 1726773092.76134: done queuing things up, now waiting for results queue to drain 8240 1726773092.76135: waiting for pending results... 
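"Set platform/version specific variables" (set_vars.yml:31) is the first task in this stretch that actually runs. The log shows an include_vars driven by a first_found lookup over a task var named ffparams, with role_path in scope, and its result reports that vars/default.yml was the file loaded. A hedged sketch of the common pattern follows; the candidate file names are illustrative guesses and are not taken from the log.

# Hedged sketch of set_vars.yml:31 (include_vars + first_found, ffparams, role_path from the log).
- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', ffparams) }}"
  vars:
    ffparams:
      files:
        # illustrative candidates only; the log does not list them
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - default.yml
      paths:
        - "{{ role_path }}/vars"

On this host the lookup fell through to vars/default.yml, which is where __kernel_settings_packages (tuned, python3-configobj) and __kernel_settings_services (tuned) come from, as the ok: result further down shows.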
11105 1726773092.76261: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 11105 1726773092.76387: in run() - task 0affffe7-6841-885f-bbcf-000000000bda 11105 1726773092.76409: variable 'ansible_search_path' from source: unknown 11105 1726773092.76414: variable 'ansible_search_path' from source: unknown 11105 1726773092.76452: calling self._execute() 11105 1726773092.76536: variable 'ansible_host' from source: host vars for 'managed_node2' 11105 1726773092.76549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11105 1726773092.76557: variable 'omit' from source: magic vars 11105 1726773092.76653: variable 'omit' from source: magic vars 11105 1726773092.76718: variable 'omit' from source: magic vars 11105 1726773092.77073: variable 'ffparams' from source: task vars 11105 1726773092.77293: variable 'ansible_facts' from source: unknown 11105 1726773092.77477: variable 'ansible_facts' from source: unknown 11105 1726773092.77602: variable 'ansible_facts' from source: unknown 11105 1726773092.77687: variable 'ansible_facts' from source: unknown 11105 1726773092.77761: variable 'role_path' from source: magic vars 11105 1726773092.77921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11105 1726773092.78119: Loaded config def from plugin (lookup/first_found) 11105 1726773092.78127: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 11105 1726773092.78162: variable 'ansible_search_path' from source: unknown 11105 1726773092.78183: variable 'ansible_search_path' from source: unknown 11105 1726773092.78211: variable 'ansible_search_path' from source: unknown 11105 1726773092.78221: variable 'ansible_search_path' from source: unknown 11105 1726773092.78229: variable 'ansible_search_path' from source: unknown 11105 1726773092.78248: variable 'omit' from source: magic vars 11105 1726773092.78269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11105 1726773092.78299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11105 1726773092.78315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11105 1726773092.78327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11105 1726773092.78334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11105 1726773092.78363: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11105 1726773092.78373: variable 'ansible_host' from source: host vars for 'managed_node2' 11105 1726773092.78377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11105 1726773092.78443: Set connection var ansible_pipelining to False 11105 1726773092.78449: Set connection var ansible_timeout to 10 11105 1726773092.78454: Set connection var ansible_module_compression to ZIP_DEFLATED 11105 1726773092.78456: Set connection var ansible_shell_type to sh 11105 1726773092.78458: Set connection var ansible_shell_executable to /bin/sh 11105 1726773092.78461: Set connection var ansible_connection to ssh 11105 1726773092.78474: variable 'ansible_shell_executable' from source: unknown 11105 1726773092.78477: variable 'ansible_connection' from source: unknown 
11105 1726773092.78479: variable 'ansible_module_compression' from source: unknown 11105 1726773092.78480: variable 'ansible_shell_type' from source: unknown 11105 1726773092.78482: variable 'ansible_shell_executable' from source: unknown 11105 1726773092.78484: variable 'ansible_host' from source: host vars for 'managed_node2' 11105 1726773092.78488: variable 'ansible_pipelining' from source: unknown 11105 1726773092.78490: variable 'ansible_timeout' from source: unknown 11105 1726773092.78492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11105 1726773092.78570: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11105 1726773092.78580: variable 'omit' from source: magic vars 11105 1726773092.78584: starting attempt loop 11105 1726773092.78589: running the handler 11105 1726773092.78633: handler run complete 11105 1726773092.78642: attempt loop complete, returning result 11105 1726773092.78645: _execute() done 11105 1726773092.78647: dumping result to json 11105 1726773092.78649: done dumping result, returning 11105 1726773092.78654: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-885f-bbcf-000000000bda] 11105 1726773092.78658: sending task result for task 0affffe7-6841-885f-bbcf-000000000bda 11105 1726773092.78678: done sending task result for task 0affffe7-6841-885f-bbcf-000000000bda 11105 1726773092.78680: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8240 1726773092.78980: no more pending results, returning what we have 8240 1726773092.78983: results queue empty 8240 1726773092.78983: checking for any_errors_fatal 8240 1726773092.78989: done checking for any_errors_fatal 8240 1726773092.78990: checking for max_fail_percentage 8240 1726773092.78991: done checking for max_fail_percentage 8240 1726773092.78991: checking to see if all hosts have failed and the running result is not ok 8240 1726773092.78992: done checking to see if all hosts have failed 8240 1726773092.78992: getting the remaining hosts for this loop 8240 1726773092.78993: done getting the remaining hosts for this loop 8240 1726773092.78996: getting the next task for host managed_node2 8240 1726773092.79002: done getting next task for host managed_node2 8240 1726773092.79005: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8240 1726773092.79006: ^ state is: HOST STATE: block=2, task=43, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8240 1726773092.79014: getting variables 8240 1726773092.79015: in VariableManager get_vars() 8240 1726773092.79043: Calling all_inventory to load vars for managed_node2 8240 1726773092.79045: Calling groups_inventory to load vars for managed_node2 8240 1726773092.79046: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773092.79054: Calling all_plugins_play to load vars for managed_node2 8240 1726773092.79056: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773092.79057: Calling groups_plugins_play to load vars for managed_node2 8240 1726773092.79206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773092.79326: done with get_vars() 8240 1726773092.79333: done getting variables 8240 1726773092.79374: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:32 -0400 (0:00:00.034) 0:01:11.437 **** 8240 1726773092.79399: entering _queue_task() for managed_node2/package 8240 1726773092.79549: worker is 1 (out of 1 available) 8240 1726773092.79564: exiting _queue_task() for managed_node2/package 8240 1726773092.79577: done queuing things up, now waiting for results queue to drain 8240 1726773092.79579: waiting for pending results... 
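"Ensure required packages are installed" (main.yml:12) uses the package action, which the log resolves to ansible.legacy.dnf with arguments name: ["tuned", "python3-configobj"] and state: present. A reconstruction consistent with those logged arguments and with the __kernel_settings_packages fact loaded above:

# Reconstruction of main.yml:12 from the module arguments logged below.
- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"
    state: present

The log lines that follow show the standard remote module execution flow: create a temp directory under ~/.ansible/tmp, transfer AnsiballZ_dnf.py via sftp, run it with /usr/libexec/platform-python, then remove the temp directory. The module returns "Nothing to do" because both packages are already present.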
11108 1726773092.79709: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 11108 1726773092.79827: in run() - task 0affffe7-6841-885f-bbcf-0000000009f0 11108 1726773092.79843: variable 'ansible_search_path' from source: unknown 11108 1726773092.79847: variable 'ansible_search_path' from source: unknown 11108 1726773092.79874: calling self._execute() 11108 1726773092.79942: variable 'ansible_host' from source: host vars for 'managed_node2' 11108 1726773092.79950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11108 1726773092.79958: variable 'omit' from source: magic vars 11108 1726773092.80039: variable 'omit' from source: magic vars 11108 1726773092.80073: variable 'omit' from source: magic vars 11108 1726773092.80096: variable '__kernel_settings_packages' from source: include_vars 11108 1726773092.80325: variable '__kernel_settings_packages' from source: include_vars 11108 1726773092.80516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11108 1726773092.83108: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11108 1726773092.83169: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11108 1726773092.83210: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11108 1726773092.83256: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11108 1726773092.83280: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11108 1726773092.83364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11108 1726773092.83384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11108 1726773092.83414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11108 1726773092.83456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11108 1726773092.83472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11108 1726773092.83574: variable '__kernel_settings_is_ostree' from source: set_fact 11108 1726773092.83582: variable 'omit' from source: magic vars 11108 1726773092.83613: variable 'omit' from source: magic vars 11108 1726773092.83639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11108 1726773092.83664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11108 1726773092.83683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11108 1726773092.83701: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11108 1726773092.83712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11108 1726773092.83738: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11108 1726773092.83743: variable 'ansible_host' from source: host vars for 'managed_node2' 11108 1726773092.83749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11108 1726773092.83844: Set connection var ansible_pipelining to False 11108 1726773092.83853: Set connection var ansible_timeout to 10 11108 1726773092.83862: Set connection var ansible_module_compression to ZIP_DEFLATED 11108 1726773092.83865: Set connection var ansible_shell_type to sh 11108 1726773092.83871: Set connection var ansible_shell_executable to /bin/sh 11108 1726773092.83876: Set connection var ansible_connection to ssh 11108 1726773092.83901: variable 'ansible_shell_executable' from source: unknown 11108 1726773092.83906: variable 'ansible_connection' from source: unknown 11108 1726773092.83909: variable 'ansible_module_compression' from source: unknown 11108 1726773092.83912: variable 'ansible_shell_type' from source: unknown 11108 1726773092.83914: variable 'ansible_shell_executable' from source: unknown 11108 1726773092.83917: variable 'ansible_host' from source: host vars for 'managed_node2' 11108 1726773092.83920: variable 'ansible_pipelining' from source: unknown 11108 1726773092.83923: variable 'ansible_timeout' from source: unknown 11108 1726773092.83927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11108 1726773092.84019: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11108 1726773092.84033: variable 'omit' from source: magic vars 11108 1726773092.84039: starting attempt loop 11108 1726773092.84043: running the handler 11108 1726773092.84121: variable 'ansible_facts' from source: unknown 11108 1726773092.84201: _low_level_execute_command(): starting 11108 1726773092.84210: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11108 1726773092.86567: stdout chunk (state=2): >>>/root <<< 11108 1726773092.86682: stderr chunk (state=3): >>><<< 11108 1726773092.86689: stdout chunk (state=3): >>><<< 11108 1726773092.86707: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11108 1726773092.86719: _low_level_execute_command(): starting 11108 1726773092.86724: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522 `" && echo ansible-tmp-1726773092.867152-11108-62454813631522="` echo /root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522 `" ) && sleep 0' 11108 1726773092.89230: stdout chunk (state=2): >>>ansible-tmp-1726773092.867152-11108-62454813631522=/root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522 <<< 11108 1726773092.89357: stderr chunk (state=3): >>><<< 11108 1726773092.89364: stdout chunk (state=3): >>><<< 11108 1726773092.89379: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773092.867152-11108-62454813631522=/root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522 , stderr= 11108 1726773092.89406: variable 'ansible_module_compression' from source: unknown 11108 1726773092.89450: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11108 1726773092.89488: variable 'ansible_facts' from source: unknown 11108 1726773092.89577: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522/AnsiballZ_dnf.py 11108 1726773092.89677: Sending initial data 11108 1726773092.89684: Sent initial data (149 bytes) 11108 1726773092.92223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmph01bxo12 /root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522/AnsiballZ_dnf.py <<< 11108 1726773092.93619: stderr chunk (state=3): >>><<< 11108 1726773092.93626: stdout chunk (state=3): >>><<< 11108 1726773092.93650: done transferring module to remote 11108 1726773092.93661: _low_level_execute_command(): starting 11108 1726773092.93666: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522/ /root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522/AnsiballZ_dnf.py && sleep 0' 11108 1726773092.96007: stderr chunk (state=2): >>><<< 11108 1726773092.96022: stdout chunk (state=3): >>><<< 11108 1726773092.96033: _low_level_execute_command() done: rc=0, stdout=, stderr= 11108 1726773092.96037: _low_level_execute_command(): starting 11108 1726773092.96043: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522/AnsiballZ_dnf.py && sleep 0' 11108 1726773095.52432: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 11108 1726773095.60223: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 11108 1726773095.60272: stderr chunk (state=3): >>><<< 11108 1726773095.60279: stdout chunk (state=3): >>><<< 11108 1726773095.60298: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11108 1726773095.60334: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11108 1726773095.60342: _low_level_execute_command(): starting 11108 1726773095.60348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773092.867152-11108-62454813631522/ > /dev/null 2>&1 && sleep 0' 11108 1726773095.62787: stderr chunk (state=2): >>><<< 11108 1726773095.62797: stdout chunk (state=2): >>><<< 11108 1726773095.62815: _low_level_execute_command() done: rc=0, stdout=, stderr= 11108 1726773095.62824: handler run complete 11108 1726773095.62851: attempt loop complete, returning result 11108 1726773095.62855: _execute() done 11108 1726773095.62858: dumping result to json 11108 1726773095.62864: done dumping result, returning 11108 1726773095.62871: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-885f-bbcf-0000000009f0] 11108 1726773095.62876: sending task result for task 0affffe7-6841-885f-bbcf-0000000009f0 11108 1726773095.62910: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009f0 11108 1726773095.62913: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8240 1726773095.63075: no more pending results, returning what we have 8240 1726773095.63079: results queue empty 8240 1726773095.63080: checking for any_errors_fatal 8240 1726773095.63089: done checking for any_errors_fatal 8240 1726773095.63090: checking for max_fail_percentage 8240 1726773095.63092: done checking for max_fail_percentage 8240 1726773095.63092: checking to see if all hosts have failed and the running result is not ok 8240 1726773095.63093: done checking to see if all hosts have failed 8240 1726773095.63094: getting the remaining hosts for this loop 8240 
1726773095.63095: done getting the remaining hosts for this loop 8240 1726773095.63099: getting the next task for host managed_node2 8240 1726773095.63107: done getting next task for host managed_node2 8240 1726773095.63110: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8240 1726773095.63112: ^ state is: HOST STATE: block=2, task=44, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773095.63123: getting variables 8240 1726773095.63124: in VariableManager get_vars() 8240 1726773095.63158: Calling all_inventory to load vars for managed_node2 8240 1726773095.63161: Calling groups_inventory to load vars for managed_node2 8240 1726773095.63163: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773095.63174: Calling all_plugins_play to load vars for managed_node2 8240 1726773095.63177: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773095.63179: Calling groups_plugins_play to load vars for managed_node2 8240 1726773095.63294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773095.63416: done with get_vars() 8240 1726773095.63424: done getting variables 8240 1726773095.63467: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:35 -0400 (0:00:02.840) 0:01:14.278 **** 8240 1726773095.63494: entering _queue_task() for managed_node2/debug 8240 1726773095.63663: worker is 1 (out of 1 available) 8240 1726773095.63678: exiting _queue_task() for managed_node2/debug 8240 1726773095.63693: done queuing things up, now waiting for results queue to drain 8240 1726773095.63696: waiting for pending results... 
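Note for readers scanning this log: the "Ensure required packages are installed" result above ("Nothing to do", "changed": false) means both tuned and python3-configobj were already present on managed_node2, so the dnf transaction was a no-op. A minimal sketch of a task that would produce the logged invocation is shown below; the package names and state are copied from the module_args recorded in the log, while the real role almost certainly supplies them through a role variable rather than a literal list.

    # Illustrative sketch only, reconstructed from the logged module_args;
    # the actual role task likely templates the package list from a variable.
    - name: Ensure required packages are installed
      ansible.builtin.dnf:
        name:
          - tuned
          - python3-configobj
        state: present
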
11201 1726773095.63826: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 11201 1726773095.63945: in run() - task 0affffe7-6841-885f-bbcf-0000000009f2 11201 1726773095.63962: variable 'ansible_search_path' from source: unknown 11201 1726773095.63965: variable 'ansible_search_path' from source: unknown 11201 1726773095.63994: calling self._execute() 11201 1726773095.64064: variable 'ansible_host' from source: host vars for 'managed_node2' 11201 1726773095.64074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11201 1726773095.64082: variable 'omit' from source: magic vars 11201 1726773095.64434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11201 1726773095.66248: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11201 1726773095.66302: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11201 1726773095.66333: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11201 1726773095.66360: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11201 1726773095.66382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11201 1726773095.66441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11201 1726773095.66462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11201 1726773095.66481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11201 1726773095.66511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11201 1726773095.66523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11201 1726773095.66603: variable '__kernel_settings_is_transactional' from source: set_fact 11201 1726773095.66620: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11201 1726773095.66625: when evaluation is False, skipping this task 11201 1726773095.66629: _execute() done 11201 1726773095.66633: dumping result to json 11201 1726773095.66637: done dumping result, returning 11201 1726773095.66644: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-0000000009f2] 11201 1726773095.66649: sending task result for task 0affffe7-6841-885f-bbcf-0000000009f2 11201 1726773095.66674: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009f2 11201 1726773095.66677: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | 
d(false)" } 8240 1726773095.66790: no more pending results, returning what we have 8240 1726773095.66793: results queue empty 8240 1726773095.66794: checking for any_errors_fatal 8240 1726773095.66804: done checking for any_errors_fatal 8240 1726773095.66805: checking for max_fail_percentage 8240 1726773095.66807: done checking for max_fail_percentage 8240 1726773095.66807: checking to see if all hosts have failed and the running result is not ok 8240 1726773095.66808: done checking to see if all hosts have failed 8240 1726773095.66809: getting the remaining hosts for this loop 8240 1726773095.66810: done getting the remaining hosts for this loop 8240 1726773095.66814: getting the next task for host managed_node2 8240 1726773095.66824: done getting next task for host managed_node2 8240 1726773095.66828: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8240 1726773095.66830: ^ state is: HOST STATE: block=2, task=44, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773095.66846: getting variables 8240 1726773095.66847: in VariableManager get_vars() 8240 1726773095.66882: Calling all_inventory to load vars for managed_node2 8240 1726773095.66886: Calling groups_inventory to load vars for managed_node2 8240 1726773095.66888: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773095.66898: Calling all_plugins_play to load vars for managed_node2 8240 1726773095.66903: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773095.66906: Calling groups_plugins_play to load vars for managed_node2 8240 1726773095.67031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773095.67195: done with get_vars() 8240 1726773095.67205: done getting variables 8240 1726773095.67247: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:35 -0400 (0:00:00.037) 0:01:14.316 **** 8240 1726773095.67272: entering _queue_task() for managed_node2/reboot 8240 1726773095.67448: worker is 1 (out of 1 available) 8240 1726773095.67462: exiting _queue_task() for managed_node2/reboot 8240 1726773095.67477: done queuing things up, now waiting for results queue to drain 8240 1726773095.67479: waiting for pending results... 
11202 1726773095.67606: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 11202 1726773095.67726: in run() - task 0affffe7-6841-885f-bbcf-0000000009f3 11202 1726773095.67742: variable 'ansible_search_path' from source: unknown 11202 1726773095.67746: variable 'ansible_search_path' from source: unknown 11202 1726773095.67775: calling self._execute() 11202 1726773095.67847: variable 'ansible_host' from source: host vars for 'managed_node2' 11202 1726773095.67856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11202 1726773095.67865: variable 'omit' from source: magic vars 11202 1726773095.68216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11202 1726773095.69937: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11202 1726773095.69988: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11202 1726773095.70028: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11202 1726773095.70056: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11202 1726773095.70077: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11202 1726773095.70134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11202 1726773095.70157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11202 1726773095.70176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11202 1726773095.70206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11202 1726773095.70219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11202 1726773095.70296: variable '__kernel_settings_is_transactional' from source: set_fact 11202 1726773095.70314: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11202 1726773095.70318: when evaluation is False, skipping this task 11202 1726773095.70322: _execute() done 11202 1726773095.70326: dumping result to json 11202 1726773095.70330: done dumping result, returning 11202 1726773095.70337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-885f-bbcf-0000000009f3] 11202 1726773095.70342: sending task result for task 0affffe7-6841-885f-bbcf-0000000009f3 11202 1726773095.70366: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009f3 11202 1726773095.70369: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", 
"skip_reason": "Conditional result was False" } 8240 1726773095.70480: no more pending results, returning what we have 8240 1726773095.70483: results queue empty 8240 1726773095.70484: checking for any_errors_fatal 8240 1726773095.70492: done checking for any_errors_fatal 8240 1726773095.70492: checking for max_fail_percentage 8240 1726773095.70494: done checking for max_fail_percentage 8240 1726773095.70494: checking to see if all hosts have failed and the running result is not ok 8240 1726773095.70495: done checking to see if all hosts have failed 8240 1726773095.70496: getting the remaining hosts for this loop 8240 1726773095.70497: done getting the remaining hosts for this loop 8240 1726773095.70504: getting the next task for host managed_node2 8240 1726773095.70512: done getting next task for host managed_node2 8240 1726773095.70516: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8240 1726773095.70518: ^ state is: HOST STATE: block=2, task=44, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773095.70535: getting variables 8240 1726773095.70536: in VariableManager get_vars() 8240 1726773095.70572: Calling all_inventory to load vars for managed_node2 8240 1726773095.70574: Calling groups_inventory to load vars for managed_node2 8240 1726773095.70576: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773095.70588: Calling all_plugins_play to load vars for managed_node2 8240 1726773095.70591: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773095.70593: Calling groups_plugins_play to load vars for managed_node2 8240 1726773095.70720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773095.70840: done with get_vars() 8240 1726773095.70849: done getting variables 8240 1726773095.70894: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:35 -0400 (0:00:00.036) 0:01:14.353 **** 8240 1726773095.70921: entering _queue_task() for managed_node2/fail 8240 1726773095.71098: worker is 1 (out of 1 available) 8240 1726773095.71116: exiting _queue_task() for managed_node2/fail 8240 1726773095.71129: done queuing things up, now waiting for results queue to drain 8240 1726773095.71131: waiting for pending results... 
11203 1726773095.71259: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 11203 1726773095.71380: in run() - task 0affffe7-6841-885f-bbcf-0000000009f4 11203 1726773095.71400: variable 'ansible_search_path' from source: unknown 11203 1726773095.71405: variable 'ansible_search_path' from source: unknown 11203 1726773095.71433: calling self._execute() 11203 1726773095.71502: variable 'ansible_host' from source: host vars for 'managed_node2' 11203 1726773095.71511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11203 1726773095.71520: variable 'omit' from source: magic vars 11203 1726773095.71866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11203 1726773095.73617: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11203 1726773095.73666: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11203 1726773095.73697: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11203 1726773095.73723: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11203 1726773095.73740: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11203 1726773095.73798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11203 1726773095.73831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11203 1726773095.73851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11203 1726773095.73879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11203 1726773095.73892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11203 1726773095.73966: variable '__kernel_settings_is_transactional' from source: set_fact 11203 1726773095.73983: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11203 1726773095.73990: when evaluation is False, skipping this task 11203 1726773095.73994: _execute() done 11203 1726773095.73997: dumping result to json 11203 1726773095.74001: done dumping result, returning 11203 1726773095.74008: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-885f-bbcf-0000000009f4] 11203 1726773095.74014: sending task result for task 0affffe7-6841-885f-bbcf-0000000009f4 11203 1726773095.74038: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009f4 11203 1726773095.74041: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | 
d(false)", "skip_reason": "Conditional result was False" } 8240 1726773095.74154: no more pending results, returning what we have 8240 1726773095.74157: results queue empty 8240 1726773095.74158: checking for any_errors_fatal 8240 1726773095.74164: done checking for any_errors_fatal 8240 1726773095.74165: checking for max_fail_percentage 8240 1726773095.74166: done checking for max_fail_percentage 8240 1726773095.74167: checking to see if all hosts have failed and the running result is not ok 8240 1726773095.74168: done checking to see if all hosts have failed 8240 1726773095.74169: getting the remaining hosts for this loop 8240 1726773095.74170: done getting the remaining hosts for this loop 8240 1726773095.74174: getting the next task for host managed_node2 8240 1726773095.74183: done getting next task for host managed_node2 8240 1726773095.74189: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8240 1726773095.74191: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773095.74211: getting variables 8240 1726773095.74213: in VariableManager get_vars() 8240 1726773095.74247: Calling all_inventory to load vars for managed_node2 8240 1726773095.74249: Calling groups_inventory to load vars for managed_node2 8240 1726773095.74251: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773095.74261: Calling all_plugins_play to load vars for managed_node2 8240 1726773095.74263: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773095.74266: Calling groups_plugins_play to load vars for managed_node2 8240 1726773095.74617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773095.74730: done with get_vars() 8240 1726773095.74737: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:35 -0400 (0:00:00.038) 0:01:14.391 **** 8240 1726773095.74798: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773095.74967: worker is 1 (out of 1 available) 8240 1726773095.74981: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773095.74996: done queuing things up, now waiting for results queue to drain 8240 1726773095.74998: waiting for pending results... 
11204 1726773095.75123: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 11204 1726773095.75248: in run() - task 0affffe7-6841-885f-bbcf-0000000009f6 11204 1726773095.75264: variable 'ansible_search_path' from source: unknown 11204 1726773095.75268: variable 'ansible_search_path' from source: unknown 11204 1726773095.75297: calling self._execute() 11204 1726773095.75369: variable 'ansible_host' from source: host vars for 'managed_node2' 11204 1726773095.75378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11204 1726773095.75388: variable 'omit' from source: magic vars 11204 1726773095.75466: variable 'omit' from source: magic vars 11204 1726773095.75507: variable 'omit' from source: magic vars 11204 1726773095.75529: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 11204 1726773095.75751: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 11204 1726773095.75813: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11204 1726773095.75843: variable 'omit' from source: magic vars 11204 1726773095.75877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11204 1726773095.75906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11204 1726773095.75924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11204 1726773095.75938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11204 1726773095.75950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11204 1726773095.75974: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11204 1726773095.75979: variable 'ansible_host' from source: host vars for 'managed_node2' 11204 1726773095.75984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11204 1726773095.76054: Set connection var ansible_pipelining to False 11204 1726773095.76062: Set connection var ansible_timeout to 10 11204 1726773095.76070: Set connection var ansible_module_compression to ZIP_DEFLATED 11204 1726773095.76073: Set connection var ansible_shell_type to sh 11204 1726773095.76078: Set connection var ansible_shell_executable to /bin/sh 11204 1726773095.76083: Set connection var ansible_connection to ssh 11204 1726773095.76101: variable 'ansible_shell_executable' from source: unknown 11204 1726773095.76105: variable 'ansible_connection' from source: unknown 11204 1726773095.76107: variable 'ansible_module_compression' from source: unknown 11204 1726773095.76108: variable 'ansible_shell_type' from source: unknown 11204 1726773095.76110: variable 'ansible_shell_executable' from source: unknown 11204 1726773095.76112: variable 'ansible_host' from source: host vars for 'managed_node2' 11204 1726773095.76114: variable 'ansible_pipelining' from source: unknown 11204 1726773095.76115: variable 'ansible_timeout' from source: unknown 11204 1726773095.76117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11204 1726773095.76256: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11204 1726773095.76267: variable 'omit' from source: magic vars 11204 1726773095.76274: starting attempt loop 11204 1726773095.76277: running the handler 11204 1726773095.76290: _low_level_execute_command(): starting 11204 1726773095.76298: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11204 1726773095.78661: stdout chunk (state=2): >>>/root <<< 11204 1726773095.78789: stderr chunk (state=3): >>><<< 11204 1726773095.78797: stdout chunk (state=3): >>><<< 11204 1726773095.78818: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11204 1726773095.78833: _low_level_execute_command(): starting 11204 1726773095.78840: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595 `" && echo ansible-tmp-1726773095.7882779-11204-189729878746595="` echo /root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595 `" ) && sleep 0' 11204 1726773095.81657: stdout chunk (state=2): >>>ansible-tmp-1726773095.7882779-11204-189729878746595=/root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595 <<< 11204 1726773095.81789: stderr chunk (state=3): >>><<< 11204 1726773095.81798: stdout chunk (state=3): >>><<< 11204 1726773095.81816: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773095.7882779-11204-189729878746595=/root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595 , stderr= 11204 1726773095.81856: variable 'ansible_module_compression' from source: unknown 11204 1726773095.81895: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 11204 1726773095.81929: variable 'ansible_facts' from source: unknown 11204 1726773095.81998: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595/AnsiballZ_kernel_settings_get_config.py 11204 1726773095.82100: Sending initial data 11204 1726773095.82107: Sent initial data (174 bytes) 11204 1726773095.84730: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpp45d4hoq /root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595/AnsiballZ_kernel_settings_get_config.py <<< 11204 1726773095.85810: stderr chunk (state=3): >>><<< 11204 1726773095.85820: stdout chunk (state=3): >>><<< 11204 1726773095.85841: done transferring module to remote 11204 1726773095.85852: _low_level_execute_command(): starting 11204 1726773095.85858: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595/ /root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11204 1726773095.88286: stderr chunk (state=2): >>><<< 11204 1726773095.88296: stdout chunk (state=2): >>><<< 11204 1726773095.88314: _low_level_execute_command() done: rc=0, stdout=, stderr= 11204 1726773095.88319: _low_level_execute_command(): starting 11204 1726773095.88324: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11204 1726773096.04058: stdout 
chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 11204 1726773096.05035: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11204 1726773096.05081: stderr chunk (state=3): >>><<< 11204 1726773096.05090: stdout chunk (state=3): >>><<< 11204 1726773096.05106: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 11204 1726773096.05139: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11204 1726773096.05150: _low_level_execute_command(): starting 11204 1726773096.05156: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773095.7882779-11204-189729878746595/ > /dev/null 2>&1 && sleep 0' 11204 1726773096.07607: stderr chunk (state=2): >>><<< 11204 1726773096.07618: stdout chunk (state=2): >>><<< 11204 1726773096.07634: _low_level_execute_command() done: rc=0, stdout=, stderr= 11204 1726773096.07641: handler run complete 11204 1726773096.07657: attempt loop complete, returning result 11204 1726773096.07661: _execute() done 11204 1726773096.07664: dumping result to json 11204 1726773096.07669: done dumping result, returning 11204 1726773096.07677: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-885f-bbcf-0000000009f6] 11204 1726773096.07683: sending task result for task 0affffe7-6841-885f-bbcf-0000000009f6 11204 1726773096.07717: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009f6 11204 1726773096.07721: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8240 1726773096.07890: no more pending results, returning what we have 8240 1726773096.07894: results queue empty 8240 1726773096.07895: checking for any_errors_fatal 8240 1726773096.07901: done checking for any_errors_fatal 8240 1726773096.07902: checking for max_fail_percentage 8240 1726773096.07903: done 
checking for max_fail_percentage 8240 1726773096.07904: checking to see if all hosts have failed and the running result is not ok 8240 1726773096.07905: done checking to see if all hosts have failed 8240 1726773096.07906: getting the remaining hosts for this loop 8240 1726773096.07907: done getting the remaining hosts for this loop 8240 1726773096.07911: getting the next task for host managed_node2 8240 1726773096.07917: done getting next task for host managed_node2 8240 1726773096.07920: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8240 1726773096.07923: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773096.07933: getting variables 8240 1726773096.07935: in VariableManager get_vars() 8240 1726773096.07968: Calling all_inventory to load vars for managed_node2 8240 1726773096.07971: Calling groups_inventory to load vars for managed_node2 8240 1726773096.07973: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773096.07982: Calling all_plugins_play to load vars for managed_node2 8240 1726773096.07984: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773096.07988: Calling groups_plugins_play to load vars for managed_node2 8240 1726773096.08099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773096.08220: done with get_vars() 8240 1726773096.08229: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.334) 0:01:14.726 **** 8240 1726773096.08299: entering _queue_task() for managed_node2/stat 8240 1726773096.08467: worker is 1 (out of 1 available) 8240 1726773096.08481: exiting _queue_task() for managed_node2/stat 8240 1726773096.08494: done queuing things up, now waiting for results queue to drain 8240 1726773096.08496: waiting for pending results... 
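The "Read tuned main config" task above ran the collection's own kernel_settings_get_config module against /etc/tuned/tuned-main.conf and returned the parsed key/value data shown in the result (daemon, dynamic_tuning, sleep_interval, and so on). A sketch of such an invocation follows; the module name and path argument are taken from the logged invocation, but the register name is illustrative, since the log later attributes __kernel_settings_register_tuned_main to set_fact and the role may post-process the raw result before storing it.

    # Illustrative invocation of the collection's config reader module;
    # how the role actually captures the result is not fully visible here.
    - name: Read tuned main config
      fedora.linux_system_roles.kernel_settings_get_config:
        path: /etc/tuned/tuned-main.conf
      register: __kernel_settings_register_tuned_main
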
11212 1726773096.08628: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 11212 1726773096.08749: in run() - task 0affffe7-6841-885f-bbcf-0000000009f7 11212 1726773096.08766: variable 'ansible_search_path' from source: unknown 11212 1726773096.08770: variable 'ansible_search_path' from source: unknown 11212 1726773096.08812: variable '__prof_from_conf' from source: task vars 11212 1726773096.09049: variable '__prof_from_conf' from source: task vars 11212 1726773096.09194: variable '__data' from source: task vars 11212 1726773096.09250: variable '__kernel_settings_register_tuned_main' from source: set_fact 11212 1726773096.09392: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11212 1726773096.09405: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11212 1726773096.09447: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11212 1726773096.09533: variable 'omit' from source: magic vars 11212 1726773096.09609: variable 'ansible_host' from source: host vars for 'managed_node2' 11212 1726773096.09619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11212 1726773096.09628: variable 'omit' from source: magic vars 11212 1726773096.09798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11212 1726773096.11374: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11212 1726773096.11439: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11212 1726773096.11470: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11212 1726773096.11499: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11212 1726773096.11523: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11212 1726773096.11580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11212 1726773096.11605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11212 1726773096.11630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11212 1726773096.11659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11212 1726773096.11671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11212 1726773096.11745: variable 'item' from source: unknown 11212 1726773096.11760: Evaluated conditional (item | length > 0): False 11212 1726773096.11764: when evaluation is False, skipping this task 11212 1726773096.11791: variable 'item' from source: unknown 11212 1726773096.11847: variable 'item' from source: unknown skipping: 
[managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 11212 1726773096.11932: variable 'ansible_host' from source: host vars for 'managed_node2' 11212 1726773096.11942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11212 1726773096.11952: variable 'omit' from source: magic vars 11212 1726773096.12073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11212 1726773096.12094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11212 1726773096.12114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11212 1726773096.12142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11212 1726773096.12154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11212 1726773096.12212: variable 'item' from source: unknown 11212 1726773096.12221: Evaluated conditional (item | length > 0): True 11212 1726773096.12228: variable 'omit' from source: magic vars 11212 1726773096.12256: variable 'omit' from source: magic vars 11212 1726773096.12287: variable 'item' from source: unknown 11212 1726773096.12335: variable 'item' from source: unknown 11212 1726773096.12350: variable 'omit' from source: magic vars 11212 1726773096.12370: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11212 1726773096.12393: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11212 1726773096.12411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11212 1726773096.12425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11212 1726773096.12434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11212 1726773096.12458: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11212 1726773096.12463: variable 'ansible_host' from source: host vars for 'managed_node2' 11212 1726773096.12468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11212 1726773096.12535: Set connection var ansible_pipelining to False 11212 1726773096.12542: Set connection var ansible_timeout to 10 11212 1726773096.12550: Set connection var ansible_module_compression to ZIP_DEFLATED 11212 1726773096.12553: Set connection var ansible_shell_type to sh 11212 1726773096.12559: Set connection var ansible_shell_executable to /bin/sh 11212 1726773096.12564: Set connection var ansible_connection to ssh 11212 1726773096.12579: variable 'ansible_shell_executable' from source: unknown 11212 1726773096.12582: variable 'ansible_connection' 
from source: unknown 11212 1726773096.12587: variable 'ansible_module_compression' from source: unknown 11212 1726773096.12591: variable 'ansible_shell_type' from source: unknown 11212 1726773096.12594: variable 'ansible_shell_executable' from source: unknown 11212 1726773096.12597: variable 'ansible_host' from source: host vars for 'managed_node2' 11212 1726773096.12603: variable 'ansible_pipelining' from source: unknown 11212 1726773096.12606: variable 'ansible_timeout' from source: unknown 11212 1726773096.12611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11212 1726773096.12705: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11212 1726773096.12716: variable 'omit' from source: magic vars 11212 1726773096.12722: starting attempt loop 11212 1726773096.12725: running the handler 11212 1726773096.12737: _low_level_execute_command(): starting 11212 1726773096.12744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11212 1726773096.15090: stdout chunk (state=2): >>>/root <<< 11212 1726773096.15212: stderr chunk (state=3): >>><<< 11212 1726773096.15221: stdout chunk (state=3): >>><<< 11212 1726773096.15241: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11212 1726773096.15254: _low_level_execute_command(): starting 11212 1726773096.15260: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803 `" && echo ansible-tmp-1726773096.1524925-11212-262726228672803="` echo /root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803 `" ) && sleep 0' 11212 1726773096.17811: stdout chunk (state=2): >>>ansible-tmp-1726773096.1524925-11212-262726228672803=/root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803 <<< 11212 1726773096.18021: stderr chunk (state=3): >>><<< 11212 1726773096.18029: stdout chunk (state=3): >>><<< 11212 1726773096.18045: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773096.1524925-11212-262726228672803=/root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803 , stderr= 11212 1726773096.18081: variable 'ansible_module_compression' from source: unknown 11212 1726773096.18129: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11212 1726773096.18156: variable 'ansible_facts' from source: unknown 11212 1726773096.18224: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803/AnsiballZ_stat.py 11212 1726773096.18326: Sending initial data 11212 1726773096.18334: Sent initial data (152 bytes) 11212 1726773096.20891: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpixv1lp8e /root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803/AnsiballZ_stat.py <<< 11212 1726773096.22002: stderr chunk (state=3): >>><<< 11212 1726773096.22013: stdout chunk (state=3): >>><<< 11212 1726773096.22034: done transferring module to remote 11212 1726773096.22047: _low_level_execute_command(): starting 11212 1726773096.22052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803/ 
/root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803/AnsiballZ_stat.py && sleep 0' 11212 1726773096.24426: stderr chunk (state=2): >>><<< 11212 1726773096.24437: stdout chunk (state=2): >>><<< 11212 1726773096.24453: _low_level_execute_command() done: rc=0, stdout=, stderr= 11212 1726773096.24457: _low_level_execute_command(): starting 11212 1726773096.24462: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803/AnsiballZ_stat.py && sleep 0' 11212 1726773096.39776: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11212 1726773096.40868: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11212 1726773096.40921: stderr chunk (state=3): >>><<< 11212 1726773096.40928: stdout chunk (state=3): >>><<< 11212 1726773096.40944: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 11212 1726773096.40964: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11212 1726773096.40975: _low_level_execute_command(): starting 11212 1726773096.40980: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773096.1524925-11212-262726228672803/ > /dev/null 2>&1 && sleep 0' 11212 1726773096.43405: stderr chunk (state=2): >>><<< 11212 1726773096.43416: stdout chunk (state=2): >>><<< 11212 1726773096.43432: _low_level_execute_command() done: rc=0, stdout=, stderr= 11212 1726773096.43440: handler run complete 11212 1726773096.43455: attempt loop complete, returning result 11212 1726773096.43471: variable 'item' from source: unknown 11212 1726773096.43536: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 11212 1726773096.43628: variable 'ansible_host' from source: host vars for 'managed_node2' 11212 1726773096.43638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11212 1726773096.43648: variable 'omit' from source: magic vars 11212 1726773096.43756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11212 1726773096.43779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11212 1726773096.43799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11212 1726773096.43829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11212 1726773096.43840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11212 1726773096.43903: variable 'item' from source: unknown 11212 1726773096.43913: Evaluated conditional (item | length > 0): True 11212 1726773096.43918: variable 'omit' from source: magic vars 11212 1726773096.43930: variable 'omit' from source: magic vars 11212 1726773096.43958: variable 'item' from source: unknown 11212 1726773096.44007: variable 'item' from source: unknown 11212 1726773096.44021: variable 'omit' from source: magic vars 11212 1726773096.44038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11212 1726773096.44046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11212 1726773096.44053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11212 1726773096.44065: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11212 1726773096.44069: variable 'ansible_host' from source: host vars for 'managed_node2' 11212 1726773096.44073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11212 1726773096.44129: Set connection var ansible_pipelining to False 11212 1726773096.44136: Set connection var ansible_timeout to 10 11212 1726773096.44143: Set connection var ansible_module_compression to ZIP_DEFLATED 11212 1726773096.44146: Set connection var ansible_shell_type to sh 11212 1726773096.44151: Set connection var ansible_shell_executable to /bin/sh 11212 1726773096.44156: Set connection var ansible_connection to ssh 11212 1726773096.44170: variable 'ansible_shell_executable' from source: unknown 11212 1726773096.44174: variable 'ansible_connection' from source: unknown 11212 1726773096.44177: variable 'ansible_module_compression' from source: unknown 11212 1726773096.44180: variable 'ansible_shell_type' from source: unknown 11212 1726773096.44184: variable 'ansible_shell_executable' from source: unknown 11212 1726773096.44189: variable 'ansible_host' from source: host vars for 'managed_node2' 11212 1726773096.44193: variable 'ansible_pipelining' from source: unknown 11212 1726773096.44196: variable 'ansible_timeout' from source: unknown 11212 1726773096.44203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11212 1726773096.44266: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11212 1726773096.44276: variable 
'omit' from source: magic vars 11212 1726773096.44281: starting attempt loop 11212 1726773096.44286: running the handler 11212 1726773096.44293: _low_level_execute_command(): starting 11212 1726773096.44297: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11212 1726773096.46492: stdout chunk (state=2): >>>/root <<< 11212 1726773096.46678: stderr chunk (state=3): >>><<< 11212 1726773096.46688: stdout chunk (state=3): >>><<< 11212 1726773096.46706: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11212 1726773096.46716: _low_level_execute_command(): starting 11212 1726773096.46721: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276 `" && echo ansible-tmp-1726773096.4671292-11212-165729998793276="` echo /root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276 `" ) && sleep 0' 11212 1726773096.49606: stdout chunk (state=2): >>>ansible-tmp-1726773096.4671292-11212-165729998793276=/root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276 <<< 11212 1726773096.49736: stderr chunk (state=3): >>><<< 11212 1726773096.49743: stdout chunk (state=3): >>><<< 11212 1726773096.49758: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773096.4671292-11212-165729998793276=/root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276 , stderr= 11212 1726773096.49791: variable 'ansible_module_compression' from source: unknown 11212 1726773096.49827: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11212 1726773096.49845: variable 'ansible_facts' from source: unknown 11212 1726773096.49900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276/AnsiballZ_stat.py 11212 1726773096.49990: Sending initial data 11212 1726773096.49997: Sent initial data (152 bytes) 11212 1726773096.52521: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpmecylpgj /root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276/AnsiballZ_stat.py <<< 11212 1726773096.53624: stderr chunk (state=3): >>><<< 11212 1726773096.53634: stdout chunk (state=3): >>><<< 11212 1726773096.53654: done transferring module to remote 11212 1726773096.53664: _low_level_execute_command(): starting 11212 1726773096.53669: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276/ /root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276/AnsiballZ_stat.py && sleep 0' 11212 1726773096.56038: stderr chunk (state=2): >>><<< 11212 1726773096.56050: stdout chunk (state=2): >>><<< 11212 1726773096.56066: _low_level_execute_command() done: rc=0, stdout=, stderr= 11212 1726773096.56070: _low_level_execute_command(): starting 11212 1726773096.56076: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276/AnsiballZ_stat.py && sleep 0' 11212 1726773096.71891: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773042.2211215, 
"mtime": 1726773040.2991023, "ctime": 1726773040.2991023, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11212 1726773096.73090: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11212 1726773096.73136: stderr chunk (state=3): >>><<< 11212 1726773096.73142: stdout chunk (state=3): >>><<< 11212 1726773096.73159: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773042.2211215, "mtime": 1726773040.2991023, "ctime": 1726773040.2991023, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 
11212 1726773096.73195: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11212 1726773096.73206: _low_level_execute_command(): starting 11212 1726773096.73212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773096.4671292-11212-165729998793276/ > /dev/null 2>&1 && sleep 0' 11212 1726773096.75614: stderr chunk (state=2): >>><<< 11212 1726773096.75623: stdout chunk (state=2): >>><<< 11212 1726773096.75637: _low_level_execute_command() done: rc=0, stdout=, stderr= 11212 1726773096.75643: handler run complete 11212 1726773096.75673: attempt loop complete, returning result 11212 1726773096.75689: variable 'item' from source: unknown 11212 1726773096.75752: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773042.2211215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773040.2991023, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726773040.2991023, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11212 1726773096.75795: dumping result to json 11212 1726773096.75808: done dumping result, returning 11212 1726773096.75817: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-885f-bbcf-0000000009f7] 11212 1726773096.75823: sending task result for task 0affffe7-6841-885f-bbcf-0000000009f7 11212 1726773096.75861: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009f7 11212 1726773096.75865: WORKER PROCESS EXITING 8240 1726773096.76115: no more pending results, returning what we have 8240 1726773096.76118: results queue empty 8240 1726773096.76119: checking for any_errors_fatal 8240 1726773096.76124: done checking for any_errors_fatal 8240 1726773096.76124: checking for max_fail_percentage 8240 1726773096.76126: done checking for max_fail_percentage 8240 1726773096.76126: checking to see if all hosts have failed and the running result is not ok 8240 1726773096.76127: done checking to see if all hosts have failed 8240 1726773096.76128: getting the remaining hosts for this loop 8240 1726773096.76129: done getting the remaining hosts for this loop 8240 1726773096.76132: getting the next task for host managed_node2 8240 1726773096.76137: done getting next task for host managed_node2 8240 
1726773096.76140: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8240 1726773096.76142: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773096.76152: getting variables 8240 1726773096.76153: in VariableManager get_vars() 8240 1726773096.76176: Calling all_inventory to load vars for managed_node2 8240 1726773096.76178: Calling groups_inventory to load vars for managed_node2 8240 1726773096.76179: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773096.76189: Calling all_plugins_play to load vars for managed_node2 8240 1726773096.76191: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773096.76193: Calling groups_plugins_play to load vars for managed_node2 8240 1726773096.76292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773096.76407: done with get_vars() 8240 1726773096.76414: done getting variables 8240 1726773096.76456: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.681) 0:01:15.408 **** 8240 1726773096.76478: entering _queue_task() for managed_node2/set_fact 8240 1726773096.76644: worker is 1 (out of 1 available) 8240 1726773096.76658: exiting _queue_task() for managed_node2/set_fact 8240 1726773096.76671: done queuing things up, now waiting for results queue to drain 8240 1726773096.76673: waiting for pending results... 
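The stat loop above yields one result per searched path (here /etc/tuned exists and is a directory), and the set_fact task queued next derives __kernel_settings_profile_parent from those results. A small Python sketch of one way such a selection could be expressed is shown below; the find_results list is a hypothetical, abridged stand-in for the registered loop output, and this is not the role's actual Jinja expression.

    # Hypothetical, abridged loop results shaped like the registered stat output.
    find_results = [
        {"item": "/etc/tuned", "stat": {"exists": True, "isdir": True}},
        {"item": "/usr/lib/tuned", "stat": {"exists": False, "isdir": False}},
    ]

    # Take the first candidate that exists and is a directory; with the data above
    # this yields "/etc/tuned", matching the fact the task goes on to set.
    profile_parent = next(
        (r["item"] for r in find_results if r["stat"]["exists"] and r["stat"]["isdir"]),
        None,
    )
    print(profile_parent)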
11230 1726773096.76809: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 11230 1726773096.76930: in run() - task 0affffe7-6841-885f-bbcf-0000000009f8 11230 1726773096.76946: variable 'ansible_search_path' from source: unknown 11230 1726773096.76950: variable 'ansible_search_path' from source: unknown 11230 1726773096.76978: calling self._execute() 11230 1726773096.77053: variable 'ansible_host' from source: host vars for 'managed_node2' 11230 1726773096.77062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11230 1726773096.77071: variable 'omit' from source: magic vars 11230 1726773096.77151: variable 'omit' from source: magic vars 11230 1726773096.77186: variable 'omit' from source: magic vars 11230 1726773096.77511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11230 1726773096.79062: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11230 1726773096.79117: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11230 1726773096.79146: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11230 1726773096.79174: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11230 1726773096.79197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11230 1726773096.79253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11230 1726773096.79284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11230 1726773096.79307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11230 1726773096.79334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11230 1726773096.79345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11230 1726773096.79377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11230 1726773096.79397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11230 1726773096.79415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11230 1726773096.79441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11230 1726773096.79451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11230 1726773096.79491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11230 1726773096.79511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11230 1726773096.79528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11230 1726773096.79553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11230 1726773096.79564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11230 1726773096.79721: variable '__kernel_settings_find_profile_dirs' from source: set_fact 11230 1726773096.79783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11230 1726773096.79892: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11230 1726773096.79922: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11230 1726773096.79947: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11230 1726773096.79968: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11230 1726773096.80000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11230 1726773096.80020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11230 1726773096.80037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11230 1726773096.80055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11230 1726773096.80094: variable 'omit' from source: magic vars 11230 1726773096.80117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11230 1726773096.80137: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11230 1726773096.80153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11230 1726773096.80166: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11230 1726773096.80176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11230 1726773096.80203: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11230 1726773096.80208: variable 'ansible_host' from source: host vars for 'managed_node2' 11230 1726773096.80213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11230 1726773096.80278: Set connection var ansible_pipelining to False 11230 1726773096.80287: Set connection var ansible_timeout to 10 11230 1726773096.80295: Set connection var ansible_module_compression to ZIP_DEFLATED 11230 1726773096.80298: Set connection var ansible_shell_type to sh 11230 1726773096.80307: Set connection var ansible_shell_executable to /bin/sh 11230 1726773096.80312: Set connection var ansible_connection to ssh 11230 1726773096.80328: variable 'ansible_shell_executable' from source: unknown 11230 1726773096.80332: variable 'ansible_connection' from source: unknown 11230 1726773096.80336: variable 'ansible_module_compression' from source: unknown 11230 1726773096.80340: variable 'ansible_shell_type' from source: unknown 11230 1726773096.80343: variable 'ansible_shell_executable' from source: unknown 11230 1726773096.80347: variable 'ansible_host' from source: host vars for 'managed_node2' 11230 1726773096.80351: variable 'ansible_pipelining' from source: unknown 11230 1726773096.80354: variable 'ansible_timeout' from source: unknown 11230 1726773096.80358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11230 1726773096.80424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11230 1726773096.80435: variable 'omit' from source: magic vars 11230 1726773096.80441: starting attempt loop 11230 1726773096.80445: running the handler 11230 1726773096.80455: handler run complete 11230 1726773096.80463: attempt loop complete, returning result 11230 1726773096.80466: _execute() done 11230 1726773096.80469: dumping result to json 11230 1726773096.80472: done dumping result, returning 11230 1726773096.80480: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-885f-bbcf-0000000009f8] 11230 1726773096.80487: sending task result for task 0affffe7-6841-885f-bbcf-0000000009f8 11230 1726773096.80509: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009f8 11230 1726773096.80513: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8240 1726773096.80644: no more pending results, returning what we have 8240 1726773096.80647: results queue empty 8240 1726773096.80648: checking for any_errors_fatal 8240 1726773096.80659: done checking for any_errors_fatal 8240 1726773096.80660: checking for max_fail_percentage 8240 1726773096.80661: done checking for max_fail_percentage 8240 1726773096.80662: checking to see if all hosts have failed and the running result is not ok 8240 1726773096.80663: done checking to see if all hosts have failed 8240 1726773096.80663: getting the 
remaining hosts for this loop 8240 1726773096.80665: done getting the remaining hosts for this loop 8240 1726773096.80668: getting the next task for host managed_node2 8240 1726773096.80674: done getting next task for host managed_node2 8240 1726773096.80678: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8240 1726773096.80680: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773096.80692: getting variables 8240 1726773096.80694: in VariableManager get_vars() 8240 1726773096.80727: Calling all_inventory to load vars for managed_node2 8240 1726773096.80730: Calling groups_inventory to load vars for managed_node2 8240 1726773096.80732: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773096.80741: Calling all_plugins_play to load vars for managed_node2 8240 1726773096.80744: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773096.80746: Calling groups_plugins_play to load vars for managed_node2 8240 1726773096.80864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773096.80986: done with get_vars() 8240 1726773096.80995: done getting variables 8240 1726773096.81037: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:36 -0400 (0:00:00.045) 0:01:15.454 **** 8240 1726773096.81061: entering _queue_task() for managed_node2/service 8240 1726773096.81234: worker is 1 (out of 1 available) 8240 1726773096.81248: exiting _queue_task() for managed_node2/service 8240 1726773096.81262: done queuing things up, now waiting for results queue to drain 8240 1726773096.81263: waiting for pending results... 
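The task queued above goes on to run the systemd module for the tuned unit with state=started and enabled=true, and the "status" dictionary it returns further down is essentially the unit's systemd property list (ActiveState, UnitFileState, MainPID, and so on). For comparison, here is a minimal sketch of querying a few of those same properties directly with systemctl; the property subset is only an example.

    import subprocess

    def unit_properties(unit, props):
        """Query selected systemd unit properties and return them as a dict."""
        out = subprocess.run(
            ["systemctl", "show", unit, "--property", ",".join(props)],
            capture_output=True, text=True, check=True,
        ).stdout
        return dict(line.split("=", 1) for line in out.strip().splitlines() if "=" in line)

    # These keys also appear in the module's "status" dict later in the log.
    print(unit_properties("tuned.service",
                          ["ActiveState", "SubState", "UnitFileState", "MainPID"]))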
11231 1726773096.81397: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 11231 1726773096.81519: in run() - task 0affffe7-6841-885f-bbcf-0000000009f9 11231 1726773096.81535: variable 'ansible_search_path' from source: unknown 11231 1726773096.81539: variable 'ansible_search_path' from source: unknown 11231 1726773096.81574: variable '__kernel_settings_services' from source: include_vars 11231 1726773096.81879: variable '__kernel_settings_services' from source: include_vars 11231 1726773096.81939: variable 'omit' from source: magic vars 11231 1726773096.82012: variable 'ansible_host' from source: host vars for 'managed_node2' 11231 1726773096.82020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11231 1726773096.82026: variable 'omit' from source: magic vars 11231 1726773096.82075: variable 'omit' from source: magic vars 11231 1726773096.82112: variable 'omit' from source: magic vars 11231 1726773096.82144: variable 'item' from source: unknown 11231 1726773096.82199: variable 'item' from source: unknown 11231 1726773096.82223: variable 'omit' from source: magic vars 11231 1726773096.82253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11231 1726773096.82279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11231 1726773096.82298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11231 1726773096.82314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11231 1726773096.82324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11231 1726773096.82347: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11231 1726773096.82353: variable 'ansible_host' from source: host vars for 'managed_node2' 11231 1726773096.82357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11231 1726773096.82429: Set connection var ansible_pipelining to False 11231 1726773096.82436: Set connection var ansible_timeout to 10 11231 1726773096.82444: Set connection var ansible_module_compression to ZIP_DEFLATED 11231 1726773096.82447: Set connection var ansible_shell_type to sh 11231 1726773096.82453: Set connection var ansible_shell_executable to /bin/sh 11231 1726773096.82458: Set connection var ansible_connection to ssh 11231 1726773096.82474: variable 'ansible_shell_executable' from source: unknown 11231 1726773096.82478: variable 'ansible_connection' from source: unknown 11231 1726773096.82481: variable 'ansible_module_compression' from source: unknown 11231 1726773096.82486: variable 'ansible_shell_type' from source: unknown 11231 1726773096.82489: variable 'ansible_shell_executable' from source: unknown 11231 1726773096.82492: variable 'ansible_host' from source: host vars for 'managed_node2' 11231 1726773096.82497: variable 'ansible_pipelining' from source: unknown 11231 1726773096.82500: variable 'ansible_timeout' from source: unknown 11231 1726773096.82504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11231 1726773096.82592: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11231 1726773096.82603: variable 'omit' from source: magic vars 11231 1726773096.82609: starting attempt loop 11231 1726773096.82613: running the handler 11231 1726773096.82673: variable 'ansible_facts' from source: unknown 11231 1726773096.82752: _low_level_execute_command(): starting 11231 1726773096.82761: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11231 1726773096.85238: stdout chunk (state=2): >>>/root <<< 11231 1726773096.85356: stderr chunk (state=3): >>><<< 11231 1726773096.85363: stdout chunk (state=3): >>><<< 11231 1726773096.85380: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11231 1726773096.85394: _low_level_execute_command(): starting 11231 1726773096.85400: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549 `" && echo ansible-tmp-1726773096.853891-11231-73741583062549="` echo /root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549 `" ) && sleep 0' 11231 1726773096.88120: stdout chunk (state=2): >>>ansible-tmp-1726773096.853891-11231-73741583062549=/root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549 <<< 11231 1726773096.88247: stderr chunk (state=3): >>><<< 11231 1726773096.88255: stdout chunk (state=3): >>><<< 11231 1726773096.88270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773096.853891-11231-73741583062549=/root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549 , stderr= 11231 1726773096.88297: variable 'ansible_module_compression' from source: unknown 11231 1726773096.88339: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11231 1726773096.88394: variable 'ansible_facts' from source: unknown 11231 1726773096.88551: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549/AnsiballZ_systemd.py 11231 1726773096.88656: Sending initial data 11231 1726773096.88664: Sent initial data (153 bytes) 11231 1726773096.91278: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpzmgyvwr7 /root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549/AnsiballZ_systemd.py <<< 11231 1726773096.93236: stderr chunk (state=3): >>><<< 11231 1726773096.93245: stdout chunk (state=3): >>><<< 11231 1726773096.93266: done transferring module to remote 11231 1726773096.93277: _low_level_execute_command(): starting 11231 1726773096.93282: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549/ /root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549/AnsiballZ_systemd.py && sleep 0' 11231 1726773096.95686: stderr chunk (state=2): >>><<< 11231 1726773096.95696: stdout chunk (state=2): >>><<< 11231 1726773096.95712: _low_level_execute_command() done: rc=0, stdout=, stderr= 11231 1726773096.95716: _low_level_execute_command(): starting 11231 1726773096.95721: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549/AnsiballZ_systemd.py && sleep 0' 11231 1726773097.24214: stdout chunk (state=2): >>> {"name": "tuned", "changed": 
false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "21000192", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 11231 1726773097.24248: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", 
"TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChange<<< 11231 1726773097.24326: stdout chunk (state=3): >>>TimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "AssertTimestamp": "Thu 
2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11231 1726773097.25925: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11231 1726773097.25972: stderr chunk (state=3): >>><<< 11231 1726773097.25978: stdout chunk (state=3): >>><<< 11231 1726773097.25999: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "21000192", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": 
"14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", 
"InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11231 1726773097.26124: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11231 1726773097.26144: _low_level_execute_command(): starting 11231 1726773097.26150: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773096.853891-11231-73741583062549/ > /dev/null 2>&1 && sleep 0' 11231 1726773097.28583: stderr chunk (state=2): >>><<< 11231 1726773097.28592: stdout chunk (state=2): >>><<< 11231 1726773097.28607: _low_level_execute_command() done: rc=0, stdout=, stderr= 11231 1726773097.28615: handler run complete 11231 1726773097.28647: attempt loop complete, returning result 11231 1726773097.28664: variable 'item' from source: unknown 11231 1726773097.28730: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Before": "shutdown.target multi-user.target", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", 
"LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "671", "MemoryAccounting": "yes", "MemoryCurrent": "21000192", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "WatchdogUSec": "0" } } 11231 1726773097.28829: dumping result to json 11231 1726773097.28848: done dumping result, returning 11231 1726773097.28857: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-885f-bbcf-0000000009f9] 11231 1726773097.28862: sending task result for task 
0affffe7-6841-885f-bbcf-0000000009f9 11231 1726773097.28970: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009f9 11231 1726773097.28974: WORKER PROCESS EXITING 8240 1726773097.29345: no more pending results, returning what we have 8240 1726773097.29347: results queue empty 8240 1726773097.29348: checking for any_errors_fatal 8240 1726773097.29352: done checking for any_errors_fatal 8240 1726773097.29352: checking for max_fail_percentage 8240 1726773097.29354: done checking for max_fail_percentage 8240 1726773097.29354: checking to see if all hosts have failed and the running result is not ok 8240 1726773097.29355: done checking to see if all hosts have failed 8240 1726773097.29355: getting the remaining hosts for this loop 8240 1726773097.29356: done getting the remaining hosts for this loop 8240 1726773097.29358: getting the next task for host managed_node2 8240 1726773097.29363: done getting next task for host managed_node2 8240 1726773097.29365: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8240 1726773097.29367: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773097.29374: getting variables 8240 1726773097.29375: in VariableManager get_vars() 8240 1726773097.29400: Calling all_inventory to load vars for managed_node2 8240 1726773097.29404: Calling groups_inventory to load vars for managed_node2 8240 1726773097.29405: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773097.29413: Calling all_plugins_play to load vars for managed_node2 8240 1726773097.29415: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773097.29417: Calling groups_plugins_play to load vars for managed_node2 8240 1726773097.29518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773097.29632: done with get_vars() 8240 1726773097.29640: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:37 -0400 (0:00:00.486) 0:01:15.940 **** 8240 1726773097.29708: entering _queue_task() for managed_node2/file 8240 1726773097.29870: worker is 1 (out of 1 available) 8240 1726773097.29887: exiting _queue_task() for managed_node2/file 8240 1726773097.29900: done queuing things up, now waiting for results queue to drain 8240 1726773097.29905: waiting for pending results... 
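The file task queued above ensures the kernel_settings profile directory exists under the parent chosen earlier; the result further down shows path /etc/tuned/kernel_settings, mode 0755, state directory, and changed=false because the directory was already present. A simplified, idempotent sketch of that "ensure directory with mode" behavior follows; it is an illustration, not the file module itself, and the demo path is a placeholder.

    import os, stat

    def ensure_directory(path, mode=0o755):
        """Create path if missing and normalize its mode; report whether anything changed."""
        changed = False
        if not os.path.isdir(path):
            os.makedirs(path, mode=mode)
            changed = True
        current = stat.S_IMODE(os.stat(path).st_mode)
        if current != mode:
            os.chmod(path, mode)
            changed = True
        return changed

    # The role's actual target is /etc/tuned/kernel_settings; a temp path keeps the demo harmless.
    print(ensure_directory("/tmp/kernel_settings_demo"))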
11242 1726773097.30032: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 11242 1726773097.30151: in run() - task 0affffe7-6841-885f-bbcf-0000000009fa 11242 1726773097.30167: variable 'ansible_search_path' from source: unknown 11242 1726773097.30171: variable 'ansible_search_path' from source: unknown 11242 1726773097.30200: calling self._execute() 11242 1726773097.30272: variable 'ansible_host' from source: host vars for 'managed_node2' 11242 1726773097.30281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11242 1726773097.30290: variable 'omit' from source: magic vars 11242 1726773097.30367: variable 'omit' from source: magic vars 11242 1726773097.30404: variable 'omit' from source: magic vars 11242 1726773097.30426: variable '__kernel_settings_profile_dir' from source: role '' all vars 11242 1726773097.30645: variable '__kernel_settings_profile_dir' from source: role '' all vars 11242 1726773097.30721: variable '__kernel_settings_profile_parent' from source: set_fact 11242 1726773097.30730: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11242 1726773097.30788: variable 'omit' from source: magic vars 11242 1726773097.30820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11242 1726773097.30846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11242 1726773097.30864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11242 1726773097.30878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11242 1726773097.30891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11242 1726773097.30915: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11242 1726773097.30921: variable 'ansible_host' from source: host vars for 'managed_node2' 11242 1726773097.30926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11242 1726773097.30994: Set connection var ansible_pipelining to False 11242 1726773097.31001: Set connection var ansible_timeout to 10 11242 1726773097.31007: Set connection var ansible_module_compression to ZIP_DEFLATED 11242 1726773097.31009: Set connection var ansible_shell_type to sh 11242 1726773097.31012: Set connection var ansible_shell_executable to /bin/sh 11242 1726773097.31015: Set connection var ansible_connection to ssh 11242 1726773097.31028: variable 'ansible_shell_executable' from source: unknown 11242 1726773097.31030: variable 'ansible_connection' from source: unknown 11242 1726773097.31032: variable 'ansible_module_compression' from source: unknown 11242 1726773097.31034: variable 'ansible_shell_type' from source: unknown 11242 1726773097.31035: variable 'ansible_shell_executable' from source: unknown 11242 1726773097.31037: variable 'ansible_host' from source: host vars for 'managed_node2' 11242 1726773097.31039: variable 'ansible_pipelining' from source: unknown 11242 1726773097.31040: variable 'ansible_timeout' from source: unknown 11242 1726773097.31042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11242 1726773097.31173: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11242 1726773097.31182: variable 'omit' from source: magic vars 11242 1726773097.31189: starting attempt loop 11242 1726773097.31191: running the handler 11242 1726773097.31201: _low_level_execute_command(): starting 11242 1726773097.31207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11242 1726773097.33504: stdout chunk (state=2): >>>/root <<< 11242 1726773097.33624: stderr chunk (state=3): >>><<< 11242 1726773097.33631: stdout chunk (state=3): >>><<< 11242 1726773097.33650: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11242 1726773097.33663: _low_level_execute_command(): starting 11242 1726773097.33668: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977 `" && echo ansible-tmp-1726773097.336578-11242-197306218239977="` echo /root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977 `" ) && sleep 0' 11242 1726773097.36311: stdout chunk (state=2): >>>ansible-tmp-1726773097.336578-11242-197306218239977=/root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977 <<< 11242 1726773097.36443: stderr chunk (state=3): >>><<< 11242 1726773097.36451: stdout chunk (state=3): >>><<< 11242 1726773097.36469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773097.336578-11242-197306218239977=/root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977 , stderr= 11242 1726773097.36511: variable 'ansible_module_compression' from source: unknown 11242 1726773097.36556: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11242 1726773097.36593: variable 'ansible_facts' from source: unknown 11242 1726773097.36664: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977/AnsiballZ_file.py 11242 1726773097.36769: Sending initial data 11242 1726773097.36777: Sent initial data (151 bytes) 11242 1726773097.39325: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpku0g3eo5 /root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977/AnsiballZ_file.py <<< 11242 1726773097.40452: stderr chunk (state=3): >>><<< 11242 1726773097.40460: stdout chunk (state=3): >>><<< 11242 1726773097.40480: done transferring module to remote 11242 1726773097.40492: _low_level_execute_command(): starting 11242 1726773097.40497: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977/ /root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977/AnsiballZ_file.py && sleep 0' 11242 1726773097.42876: stderr chunk (state=2): >>><<< 11242 1726773097.42886: stdout chunk (state=2): >>><<< 11242 1726773097.42902: _low_level_execute_command() done: rc=0, stdout=, stderr= 11242 1726773097.42907: _low_level_execute_command(): starting 11242 1726773097.42912: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977/AnsiballZ_file.py && sleep 0' 11242 1726773097.59550: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": 
"/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11242 1726773097.60771: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11242 1726773097.60825: stderr chunk (state=3): >>><<< 11242 1726773097.60832: stdout chunk (state=3): >>><<< 11242 1726773097.60848: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11242 1726773097.60880: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11242 1726773097.60892: _low_level_execute_command(): starting 11242 1726773097.60898: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773097.336578-11242-197306218239977/ > /dev/null 2>&1 && sleep 0' 11242 1726773097.63322: stderr chunk (state=2): >>><<< 11242 1726773097.63331: stdout chunk (state=2): >>><<< 11242 1726773097.63345: _low_level_execute_command() done: rc=0, stdout=, stderr= 11242 1726773097.63352: handler run complete 11242 1726773097.63371: attempt loop complete, returning result 11242 1726773097.63374: _execute() done 11242 1726773097.63377: dumping result to json 11242 1726773097.63383: done dumping result, returning 11242 1726773097.63393: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-885f-bbcf-0000000009fa] 11242 1726773097.63399: sending task result for task 0affffe7-6841-885f-bbcf-0000000009fa 11242 1726773097.63435: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009fa 11242 1726773097.63439: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8240 1726773097.63606: no more pending results, returning what we have 8240 1726773097.63610: results queue empty 8240 1726773097.63611: checking for any_errors_fatal 8240 1726773097.63625: done checking for any_errors_fatal 8240 1726773097.63626: checking for max_fail_percentage 8240 1726773097.63627: done checking for max_fail_percentage 8240 1726773097.63628: checking to see if all hosts have failed and the running result is not ok 8240 1726773097.63629: done checking to see if all hosts have failed 8240 1726773097.63629: getting the remaining hosts for this loop 8240 1726773097.63631: done getting the remaining hosts for this loop 8240 1726773097.63634: getting the next task for host managed_node2 8240 1726773097.63640: done getting next task for host managed_node2 8240 1726773097.63643: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8240 1726773097.63645: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 8240 1726773097.63656: getting variables 8240 1726773097.63658: in VariableManager get_vars() 8240 1726773097.63694: Calling all_inventory to load vars for managed_node2 8240 1726773097.63698: Calling groups_inventory to load vars for managed_node2 8240 1726773097.63699: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773097.63709: Calling all_plugins_play to load vars for managed_node2 8240 1726773097.63711: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773097.63713: Calling groups_plugins_play to load vars for managed_node2 8240 1726773097.63824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773097.63944: done with get_vars() 8240 1726773097.63952: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:37 -0400 (0:00:00.343) 0:01:16.284 **** 8240 1726773097.64027: entering _queue_task() for managed_node2/slurp 8240 1726773097.64196: worker is 1 (out of 1 available) 8240 1726773097.64213: exiting _queue_task() for managed_node2/slurp 8240 1726773097.64225: done queuing things up, now waiting for results queue to drain 8240 1726773097.64227: waiting for pending results... 11250 1726773097.64352: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 11250 1726773097.64471: in run() - task 0affffe7-6841-885f-bbcf-0000000009fb 11250 1726773097.64489: variable 'ansible_search_path' from source: unknown 11250 1726773097.64493: variable 'ansible_search_path' from source: unknown 11250 1726773097.64521: calling self._execute() 11250 1726773097.64596: variable 'ansible_host' from source: host vars for 'managed_node2' 11250 1726773097.64605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11250 1726773097.64611: variable 'omit' from source: magic vars 11250 1726773097.64688: variable 'omit' from source: magic vars 11250 1726773097.64723: variable 'omit' from source: magic vars 11250 1726773097.64742: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11250 1726773097.64960: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11250 1726773097.65024: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11250 1726773097.65054: variable 'omit' from source: magic vars 11250 1726773097.65088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11250 1726773097.65174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11250 1726773097.65195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11250 1726773097.65211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11250 1726773097.65222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11250 1726773097.65246: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11250 1726773097.65251: variable 'ansible_host' from source: host vars for 'managed_node2' 11250 1726773097.65256: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node2' 11250 1726773097.65327: Set connection var ansible_pipelining to False 11250 1726773097.65335: Set connection var ansible_timeout to 10 11250 1726773097.65342: Set connection var ansible_module_compression to ZIP_DEFLATED 11250 1726773097.65345: Set connection var ansible_shell_type to sh 11250 1726773097.65350: Set connection var ansible_shell_executable to /bin/sh 11250 1726773097.65355: Set connection var ansible_connection to ssh 11250 1726773097.65371: variable 'ansible_shell_executable' from source: unknown 11250 1726773097.65375: variable 'ansible_connection' from source: unknown 11250 1726773097.65378: variable 'ansible_module_compression' from source: unknown 11250 1726773097.65381: variable 'ansible_shell_type' from source: unknown 11250 1726773097.65387: variable 'ansible_shell_executable' from source: unknown 11250 1726773097.65390: variable 'ansible_host' from source: host vars for 'managed_node2' 11250 1726773097.65395: variable 'ansible_pipelining' from source: unknown 11250 1726773097.65398: variable 'ansible_timeout' from source: unknown 11250 1726773097.65402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11250 1726773097.65534: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11250 1726773097.65545: variable 'omit' from source: magic vars 11250 1726773097.65549: starting attempt loop 11250 1726773097.65551: running the handler 11250 1726773097.65561: _low_level_execute_command(): starting 11250 1726773097.65566: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11250 1726773097.67852: stdout chunk (state=2): >>>/root <<< 11250 1726773097.67975: stderr chunk (state=3): >>><<< 11250 1726773097.67981: stdout chunk (state=3): >>><<< 11250 1726773097.68002: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11250 1726773097.68016: _low_level_execute_command(): starting 11250 1726773097.68022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010 `" && echo ansible-tmp-1726773097.680104-11250-213349411499010="` echo /root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010 `" ) && sleep 0' 11250 1726773097.70610: stdout chunk (state=2): >>>ansible-tmp-1726773097.680104-11250-213349411499010=/root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010 <<< 11250 1726773097.70738: stderr chunk (state=3): >>><<< 11250 1726773097.70745: stdout chunk (state=3): >>><<< 11250 1726773097.70760: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773097.680104-11250-213349411499010=/root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010 , stderr= 11250 1726773097.70799: variable 'ansible_module_compression' from source: unknown 11250 1726773097.70835: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 11250 1726773097.70865: variable 'ansible_facts' from source: unknown 11250 1726773097.70941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010/AnsiballZ_slurp.py 11250 1726773097.71043: Sending initial data 11250 1726773097.71050: Sent initial data (152 bytes) 11250 
1726773097.73576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp4_pjkj8b /root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010/AnsiballZ_slurp.py <<< 11250 1726773097.74674: stderr chunk (state=3): >>><<< 11250 1726773097.74684: stdout chunk (state=3): >>><<< 11250 1726773097.74708: done transferring module to remote 11250 1726773097.74720: _low_level_execute_command(): starting 11250 1726773097.74726: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010/ /root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010/AnsiballZ_slurp.py && sleep 0' 11250 1726773097.77102: stderr chunk (state=2): >>><<< 11250 1726773097.77111: stdout chunk (state=2): >>><<< 11250 1726773097.77125: _low_level_execute_command() done: rc=0, stdout=, stderr= 11250 1726773097.77130: _low_level_execute_command(): starting 11250 1726773097.77135: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010/AnsiballZ_slurp.py && sleep 0' 11250 1726773097.91946: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11250 1726773097.92935: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11250 1726773097.92983: stderr chunk (state=3): >>><<< 11250 1726773097.92992: stdout chunk (state=3): >>><<< 11250 1726773097.93010: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.9.64 closed. 
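slurp always returns file content base64-encoded: dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK decodes to "virtual-guest kernel_settings" plus a trailing newline, which is the value the next task turns into the __kernel_settings_active_profile fact. A sketch of this read, assuming the role registers the result under the __kernel_settings_tuned_current_profile name that appears later in the trace:

    - name: Get active_profile
      ansible.builtin.slurp:
        path: /etc/tuned/active_profile
      register: __kernel_settings_tuned_current_profile

    - name: Show the decoded value                    # hypothetical debug, not in the role
      ansible.builtin.debug:
        msg: "{{ __kernel_settings_tuned_current_profile.content | b64decode | trim }}"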
11250 1726773097.93035: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11250 1726773097.93046: _low_level_execute_command(): starting 11250 1726773097.93052: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773097.680104-11250-213349411499010/ > /dev/null 2>&1 && sleep 0' 11250 1726773097.95487: stderr chunk (state=2): >>><<< 11250 1726773097.95499: stdout chunk (state=2): >>><<< 11250 1726773097.95517: _low_level_execute_command() done: rc=0, stdout=, stderr= 11250 1726773097.95525: handler run complete 11250 1726773097.95539: attempt loop complete, returning result 11250 1726773097.95543: _execute() done 11250 1726773097.95546: dumping result to json 11250 1726773097.95550: done dumping result, returning 11250 1726773097.95558: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-885f-bbcf-0000000009fb] 11250 1726773097.95564: sending task result for task 0affffe7-6841-885f-bbcf-0000000009fb 11250 1726773097.95596: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009fb 11250 1726773097.95600: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8240 1726773097.95767: no more pending results, returning what we have 8240 1726773097.95770: results queue empty 8240 1726773097.95771: checking for any_errors_fatal 8240 1726773097.95779: done checking for any_errors_fatal 8240 1726773097.95780: checking for max_fail_percentage 8240 1726773097.95781: done checking for max_fail_percentage 8240 1726773097.95782: checking to see if all hosts have failed and the running result is not ok 8240 1726773097.95783: done checking to see if all hosts have failed 8240 1726773097.95783: getting the remaining hosts for this loop 8240 1726773097.95786: done getting the remaining hosts for this loop 8240 1726773097.95790: getting the next task for host managed_node2 8240 1726773097.95796: done getting next task for host managed_node2 8240 1726773097.95800: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8240 1726773097.95802: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773097.95813: getting variables 8240 1726773097.95814: in VariableManager get_vars() 8240 1726773097.95849: Calling all_inventory to load vars for managed_node2 8240 1726773097.95852: Calling groups_inventory to load vars for managed_node2 8240 1726773097.95854: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773097.95863: Calling all_plugins_play to load vars for managed_node2 8240 1726773097.95866: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773097.95867: Calling groups_plugins_play to load vars for managed_node2 8240 1726773097.95982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773097.96151: done with get_vars() 8240 1726773097.96159: done getting variables 8240 1726773097.96205: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:37 -0400 (0:00:00.322) 0:01:16.606 **** 8240 1726773097.96229: entering _queue_task() for managed_node2/set_fact 8240 1726773097.96399: worker is 1 (out of 1 available) 8240 1726773097.96415: exiting _queue_task() for managed_node2/set_fact 8240 1726773097.96429: done queuing things up, now waiting for results queue to drain 8240 1726773097.96431: waiting for pending results... 11258 1726773097.96568: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 11258 1726773097.96694: in run() - task 0affffe7-6841-885f-bbcf-0000000009fc 11258 1726773097.96713: variable 'ansible_search_path' from source: unknown 11258 1726773097.96717: variable 'ansible_search_path' from source: unknown 11258 1726773097.96745: calling self._execute() 11258 1726773097.96819: variable 'ansible_host' from source: host vars for 'managed_node2' 11258 1726773097.96828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11258 1726773097.96836: variable 'omit' from source: magic vars 11258 1726773097.96916: variable 'omit' from source: magic vars 11258 1726773097.96950: variable 'omit' from source: magic vars 11258 1726773097.97256: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11258 1726773097.97265: variable '__cur_profile' from source: task vars 11258 1726773097.97373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11258 1726773097.98902: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11258 1726773097.98959: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11258 1726773097.98990: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11258 1726773097.99020: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11258 1726773097.99040: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11258 1726773097.99098: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11258 1726773097.99122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11258 1726773097.99140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11258 1726773097.99165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11258 1726773097.99175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11258 1726773097.99253: variable '__kernel_settings_tuned_current_profile' from source: set_fact 11258 1726773097.99295: variable 'omit' from source: magic vars 11258 1726773097.99319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11258 1726773097.99340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11258 1726773097.99356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11258 1726773097.99370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11258 1726773097.99380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11258 1726773097.99407: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11258 1726773097.99413: variable 'ansible_host' from source: host vars for 'managed_node2' 11258 1726773097.99417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11258 1726773097.99483: Set connection var ansible_pipelining to False 11258 1726773097.99492: Set connection var ansible_timeout to 10 11258 1726773097.99500: Set connection var ansible_module_compression to ZIP_DEFLATED 11258 1726773097.99506: Set connection var ansible_shell_type to sh 11258 1726773097.99511: Set connection var ansible_shell_executable to /bin/sh 11258 1726773097.99516: Set connection var ansible_connection to ssh 11258 1726773097.99536: variable 'ansible_shell_executable' from source: unknown 11258 1726773097.99540: variable 'ansible_connection' from source: unknown 11258 1726773097.99544: variable 'ansible_module_compression' from source: unknown 11258 1726773097.99547: variable 'ansible_shell_type' from source: unknown 11258 1726773097.99550: variable 'ansible_shell_executable' from source: unknown 11258 1726773097.99554: variable 'ansible_host' from source: host vars for 'managed_node2' 11258 1726773097.99559: variable 'ansible_pipelining' from source: unknown 11258 1726773097.99562: variable 'ansible_timeout' from source: unknown 11258 1726773097.99566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11258 1726773097.99632: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11258 1726773097.99643: variable 'omit' from source: magic vars 11258 1726773097.99649: starting attempt loop 11258 1726773097.99652: running the handler 11258 1726773097.99662: handler run complete 11258 1726773097.99670: attempt loop complete, returning result 11258 1726773097.99673: _execute() done 11258 1726773097.99676: dumping result to json 11258 1726773097.99679: done dumping result, returning 11258 1726773097.99688: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-885f-bbcf-0000000009fc] 11258 1726773097.99694: sending task result for task 0affffe7-6841-885f-bbcf-0000000009fc 11258 1726773097.99717: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009fc 11258 1726773097.99720: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8240 1726773097.99855: no more pending results, returning what we have 8240 1726773097.99858: results queue empty 8240 1726773097.99859: checking for any_errors_fatal 8240 1726773097.99866: done checking for any_errors_fatal 8240 1726773097.99866: checking for max_fail_percentage 8240 1726773097.99868: done checking for max_fail_percentage 8240 1726773097.99869: checking to see if all hosts have failed and the running result is not ok 8240 1726773097.99870: done checking to see if all hosts have failed 8240 1726773097.99870: getting the remaining hosts for this loop 8240 1726773097.99871: done getting the remaining hosts for this loop 8240 1726773097.99874: getting the next task for host managed_node2 8240 1726773097.99881: done getting next task for host managed_node2 8240 1726773097.99884: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8240 1726773097.99888: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773097.99904: getting variables 8240 1726773097.99906: in VariableManager get_vars() 8240 1726773097.99938: Calling all_inventory to load vars for managed_node2 8240 1726773097.99941: Calling groups_inventory to load vars for managed_node2 8240 1726773097.99943: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773097.99952: Calling all_plugins_play to load vars for managed_node2 8240 1726773097.99955: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773097.99957: Calling groups_plugins_play to load vars for managed_node2 8240 1726773098.00071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773098.00193: done with get_vars() 8240 1726773098.00201: done getting variables 8240 1726773098.00243: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.040) 0:01:16.646 **** 8240 1726773098.00266: entering _queue_task() for managed_node2/copy 8240 1726773098.00429: worker is 1 (out of 1 available) 8240 1726773098.00445: exiting _queue_task() for managed_node2/copy 8240 1726773098.00458: done queuing things up, now waiting for results queue to drain 8240 1726773098.00460: waiting for pending results... 11259 1726773098.00592: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 11259 1726773098.00714: in run() - task 0affffe7-6841-885f-bbcf-0000000009fd 11259 1726773098.00730: variable 'ansible_search_path' from source: unknown 11259 1726773098.00734: variable 'ansible_search_path' from source: unknown 11259 1726773098.00760: calling self._execute() 11259 1726773098.00835: variable 'ansible_host' from source: host vars for 'managed_node2' 11259 1726773098.00844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11259 1726773098.00853: variable 'omit' from source: magic vars 11259 1726773098.00930: variable 'omit' from source: magic vars 11259 1726773098.00965: variable 'omit' from source: magic vars 11259 1726773098.00988: variable '__kernel_settings_active_profile' from source: set_fact 11259 1726773098.01209: variable '__kernel_settings_active_profile' from source: set_fact 11259 1726773098.01233: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11259 1726773098.01284: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11259 1726773098.01341: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11259 1726773098.01423: variable 'omit' from source: magic vars 11259 1726773098.01457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11259 1726773098.01483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11259 1726773098.01504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11259 1726773098.01518: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11259 1726773098.01529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11259 1726773098.01553: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11259 1726773098.01558: variable 'ansible_host' from source: host vars for 'managed_node2' 11259 1726773098.01563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11259 1726773098.01633: Set connection var ansible_pipelining to False 11259 1726773098.01640: Set connection var ansible_timeout to 10 11259 1726773098.01648: Set connection var ansible_module_compression to ZIP_DEFLATED 11259 1726773098.01652: Set connection var ansible_shell_type to sh 11259 1726773098.01657: Set connection var ansible_shell_executable to /bin/sh 11259 1726773098.01663: Set connection var ansible_connection to ssh 11259 1726773098.01678: variable 'ansible_shell_executable' from source: unknown 11259 1726773098.01682: variable 'ansible_connection' from source: unknown 11259 1726773098.01687: variable 'ansible_module_compression' from source: unknown 11259 1726773098.01690: variable 'ansible_shell_type' from source: unknown 11259 1726773098.01694: variable 'ansible_shell_executable' from source: unknown 11259 1726773098.01697: variable 'ansible_host' from source: host vars for 'managed_node2' 11259 1726773098.01703: variable 'ansible_pipelining' from source: unknown 11259 1726773098.01706: variable 'ansible_timeout' from source: unknown 11259 1726773098.01711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11259 1726773098.01802: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11259 1726773098.01814: variable 'omit' from source: magic vars 11259 1726773098.01820: starting attempt loop 11259 1726773098.01824: running the handler 11259 1726773098.01836: _low_level_execute_command(): starting 11259 1726773098.01844: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11259 1726773098.04206: stdout chunk (state=2): >>>/root <<< 11259 1726773098.04333: stderr chunk (state=3): >>><<< 11259 1726773098.04342: stdout chunk (state=3): >>><<< 11259 1726773098.04362: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11259 1726773098.04377: _low_level_execute_command(): starting 11259 1726773098.04383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854 `" && echo ansible-tmp-1726773098.0437067-11259-98639202229854="` echo /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854 `" ) && sleep 0' 11259 1726773098.06994: stdout chunk (state=2): >>>ansible-tmp-1726773098.0437067-11259-98639202229854=/root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854 <<< 11259 1726773098.07131: stderr chunk (state=3): >>><<< 11259 1726773098.07139: stdout chunk (state=3): >>><<< 11259 1726773098.07155: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773098.0437067-11259-98639202229854=/root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854 , 
stderr= 11259 1726773098.07234: variable 'ansible_module_compression' from source: unknown 11259 1726773098.07277: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11259 1726773098.07313: variable 'ansible_facts' from source: unknown 11259 1726773098.07380: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/AnsiballZ_stat.py 11259 1726773098.07470: Sending initial data 11259 1726773098.07477: Sent initial data (151 bytes) 11259 1726773098.09975: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpviusoemv /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/AnsiballZ_stat.py <<< 11259 1726773098.11073: stderr chunk (state=3): >>><<< 11259 1726773098.11081: stdout chunk (state=3): >>><<< 11259 1726773098.11106: done transferring module to remote 11259 1726773098.11118: _low_level_execute_command(): starting 11259 1726773098.11123: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/ /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/AnsiballZ_stat.py && sleep 0' 11259 1726773098.13508: stderr chunk (state=2): >>><<< 11259 1726773098.13517: stdout chunk (state=2): >>><<< 11259 1726773098.13531: _low_level_execute_command() done: rc=0, stdout=, stderr= 11259 1726773098.13536: _low_level_execute_command(): starting 11259 1726773098.13541: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/AnsiballZ_stat.py && sleep 0' 11259 1726773098.29877: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773097.9171095, "mtime": 1726773090.1230319, "ctime": 1726773090.1230319, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11259 1726773098.31027: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 11259 1726773098.31075: stderr chunk (state=3): >>><<< 11259 1726773098.31082: stdout chunk (state=3): >>><<< 11259 1726773098.31100: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773097.9171095, "mtime": 1726773090.1230319, "ctime": 1726773090.1230319, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 11259 1726773098.31143: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11259 1726773098.31181: variable 'ansible_module_compression' from source: unknown 11259 1726773098.31214: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11259 1726773098.31235: variable 'ansible_facts' from source: unknown 11259 1726773098.31293: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/AnsiballZ_file.py 11259 1726773098.31383: Sending initial data 11259 1726773098.31395: Sent initial data (151 bytes) 11259 1726773098.33950: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpfttrlyys /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/AnsiballZ_file.py <<< 11259 1726773098.35095: stderr chunk (state=3): >>><<< 11259 1726773098.35107: stdout chunk (state=3): >>><<< 11259 1726773098.35127: done transferring module to remote 11259 1726773098.35136: _low_level_execute_command(): starting 11259 1726773098.35141: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/ /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/AnsiballZ_file.py && sleep 0' 11259 1726773098.37539: stderr chunk (state=2): >>><<< 11259 1726773098.37548: stdout chunk (state=2): >>><<< 11259 1726773098.37565: _low_level_execute_command() done: rc=0, stdout=, stderr= 11259 1726773098.37570: 
_low_level_execute_command(): starting 11259 1726773098.37576: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/AnsiballZ_file.py && sleep 0' 11259 1726773098.53471: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpodbntnm6", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11259 1726773098.54584: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11259 1726773098.54636: stderr chunk (state=3): >>><<< 11259 1726773098.54643: stdout chunk (state=3): >>><<< 11259 1726773098.54659: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpodbntnm6", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11259 1726773098.54687: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmpodbntnm6', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11259 1726773098.54698: _low_level_execute_command(): starting 11259 1726773098.54704: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773098.0437067-11259-98639202229854/ > /dev/null 2>&1 && sleep 0' 11259 1726773098.57142: stderr chunk (state=2): >>><<< 11259 1726773098.57151: stdout chunk (state=2): >>><<< 11259 1726773098.57165: _low_level_execute_command() done: rc=0, stdout=, stderr= 11259 1726773098.57175: handler run complete 11259 1726773098.57199: attempt loop complete, returning result 11259 1726773098.57203: _execute() done 11259 1726773098.57206: dumping result to json 11259 1726773098.57212: done dumping result, returning 11259 1726773098.57220: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-885f-bbcf-0000000009fd] 11259 1726773098.57225: sending task result for task 0affffe7-6841-885f-bbcf-0000000009fd 11259 1726773098.57259: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009fd 11259 1726773098.57262: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8240 1726773098.57436: no more pending results, returning what we have 8240 1726773098.57440: results queue empty 8240 1726773098.57441: checking for any_errors_fatal 8240 1726773098.57446: done checking for any_errors_fatal 8240 1726773098.57447: checking for max_fail_percentage 8240 1726773098.57448: done checking for max_fail_percentage 8240 1726773098.57449: checking to see if all hosts have failed and the running result is not ok 8240 1726773098.57450: done checking to see if all hosts have failed 8240 1726773098.57450: getting the remaining hosts for this loop 8240 1726773098.57451: done getting the remaining hosts for this loop 8240 1726773098.57455: getting the next task for host managed_node2 8240 1726773098.57461: done getting next task for host managed_node2 8240 1726773098.57464: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8240 1726773098.57466: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773098.57477: getting variables 8240 1726773098.57478: in VariableManager get_vars() 8240 1726773098.57517: Calling all_inventory to load vars for managed_node2 8240 1726773098.57520: Calling groups_inventory to load vars for managed_node2 8240 1726773098.57521: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773098.57530: Calling all_plugins_play to load vars for managed_node2 8240 1726773098.57532: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773098.57533: Calling groups_plugins_play to load vars for managed_node2 8240 1726773098.57688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773098.57806: done with get_vars() 8240 1726773098.57815: done getting variables 8240 1726773098.57857: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:38 -0400 (0:00:00.576) 0:01:17.222 **** 8240 1726773098.57879: entering _queue_task() for managed_node2/copy 8240 1726773098.58050: worker is 1 (out of 1 available) 8240 1726773098.58067: exiting _queue_task() for managed_node2/copy 8240 1726773098.58080: done queuing things up, now waiting for results queue to drain 8240 1726773098.58082: waiting for pending results... 
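Both the task just completed and the one traced below use the copy action: the previous one wrote the __kernel_settings_active_profile fact ("virtual-guest kernel_settings") into /etc/tuned/active_profile, and this one writes the tuned profile mode into /etc/tuned/profile_mode. The literal content is not printed in this excerpt, but the task name, the 7-byte destination file, and the __kernel_settings_tuned_profile_mode variable resolved below all point at "manual" plus a newline; a hedged sketch of such a task:

    - name: Set profile_mode to manual
      ansible.builtin.copy:
        content: "manual\n"            # assumed; likely supplied via __kernel_settings_tuned_profile_mode in the role
        dest: /etc/tuned/profile_mode
        mode: "0600"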
11274 1726773098.58214: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 11274 1726773098.58339: in run() - task 0affffe7-6841-885f-bbcf-0000000009fe 11274 1726773098.58356: variable 'ansible_search_path' from source: unknown 11274 1726773098.58360: variable 'ansible_search_path' from source: unknown 11274 1726773098.58391: calling self._execute() 11274 1726773098.58463: variable 'ansible_host' from source: host vars for 'managed_node2' 11274 1726773098.58471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11274 1726773098.58481: variable 'omit' from source: magic vars 11274 1726773098.58558: variable 'omit' from source: magic vars 11274 1726773098.58594: variable 'omit' from source: magic vars 11274 1726773098.58617: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 11274 1726773098.58839: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 11274 1726773098.58902: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11274 1726773098.58930: variable 'omit' from source: magic vars 11274 1726773098.58963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11274 1726773098.58991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11274 1726773098.59010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11274 1726773098.59025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11274 1726773098.59036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11274 1726773098.59060: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11274 1726773098.59065: variable 'ansible_host' from source: host vars for 'managed_node2' 11274 1726773098.59069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11274 1726773098.59143: Set connection var ansible_pipelining to False 11274 1726773098.59151: Set connection var ansible_timeout to 10 11274 1726773098.59158: Set connection var ansible_module_compression to ZIP_DEFLATED 11274 1726773098.59161: Set connection var ansible_shell_type to sh 11274 1726773098.59167: Set connection var ansible_shell_executable to /bin/sh 11274 1726773098.59173: Set connection var ansible_connection to ssh 11274 1726773098.59192: variable 'ansible_shell_executable' from source: unknown 11274 1726773098.59196: variable 'ansible_connection' from source: unknown 11274 1726773098.59199: variable 'ansible_module_compression' from source: unknown 11274 1726773098.59202: variable 'ansible_shell_type' from source: unknown 11274 1726773098.59205: variable 'ansible_shell_executable' from source: unknown 11274 1726773098.59208: variable 'ansible_host' from source: host vars for 'managed_node2' 11274 1726773098.59212: variable 'ansible_pipelining' from source: unknown 11274 1726773098.59216: variable 'ansible_timeout' from source: unknown 11274 1726773098.59220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11274 1726773098.59310: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11274 1726773098.59323: variable 'omit' from source: magic vars 11274 1726773098.59330: starting attempt loop 11274 1726773098.59333: running the handler 11274 1726773098.59344: _low_level_execute_command(): starting 11274 1726773098.59352: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11274 1726773098.61661: stdout chunk (state=2): >>>/root <<< 11274 1726773098.61789: stderr chunk (state=3): >>><<< 11274 1726773098.61796: stdout chunk (state=3): >>><<< 11274 1726773098.61816: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11274 1726773098.61830: _low_level_execute_command(): starting 11274 1726773098.61836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853 `" && echo ansible-tmp-1726773098.6182492-11274-255831349399853="` echo /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853 `" ) && sleep 0' 11274 1726773098.64368: stdout chunk (state=2): >>>ansible-tmp-1726773098.6182492-11274-255831349399853=/root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853 <<< 11274 1726773098.64501: stderr chunk (state=3): >>><<< 11274 1726773098.64509: stdout chunk (state=3): >>><<< 11274 1726773098.64524: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773098.6182492-11274-255831349399853=/root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853 , stderr= 11274 1726773098.64597: variable 'ansible_module_compression' from source: unknown 11274 1726773098.64642: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11274 1726773098.64673: variable 'ansible_facts' from source: unknown 11274 1726773098.64743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/AnsiballZ_stat.py 11274 1726773098.64831: Sending initial data 11274 1726773098.64838: Sent initial data (152 bytes) 11274 1726773098.67310: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpstz9e12o /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/AnsiballZ_stat.py <<< 11274 1726773098.68411: stderr chunk (state=3): >>><<< 11274 1726773098.68419: stdout chunk (state=3): >>><<< 11274 1726773098.68439: done transferring module to remote 11274 1726773098.68450: _low_level_execute_command(): starting 11274 1726773098.68455: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/ /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/AnsiballZ_stat.py && sleep 0' 11274 1726773098.70818: stderr chunk (state=2): >>><<< 11274 1726773098.70828: stdout chunk (state=2): >>><<< 11274 1726773098.70842: _low_level_execute_command() done: rc=0, stdout=, stderr= 11274 1726773098.70846: _low_level_execute_command(): starting 11274 1726773098.70853: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/AnsiballZ_stat.py && sleep 0' 11274 1726773098.86883: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", 
"isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773088.2910137, "mtime": 1726773090.1230319, "ctime": 1726773090.1230319, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11274 1726773098.88061: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11274 1726773098.88111: stderr chunk (state=3): >>><<< 11274 1726773098.88119: stdout chunk (state=3): >>><<< 11274 1726773098.88136: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773088.2910137, "mtime": 1726773090.1230319, "ctime": 1726773090.1230319, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
11274 1726773098.88177: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11274 1726773098.88218: variable 'ansible_module_compression' from source: unknown 11274 1726773098.88251: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11274 1726773098.88272: variable 'ansible_facts' from source: unknown 11274 1726773098.88332: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/AnsiballZ_file.py 11274 1726773098.88423: Sending initial data 11274 1726773098.88430: Sent initial data (152 bytes) 11274 1726773098.90976: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp4bgldzfm /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/AnsiballZ_file.py <<< 11274 1726773098.92106: stderr chunk (state=3): >>><<< 11274 1726773098.92114: stdout chunk (state=3): >>><<< 11274 1726773098.92132: done transferring module to remote 11274 1726773098.92141: _low_level_execute_command(): starting 11274 1726773098.92146: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/ /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/AnsiballZ_file.py && sleep 0' 11274 1726773098.94498: stderr chunk (state=2): >>><<< 11274 1726773098.94507: stdout chunk (state=2): >>><<< 11274 1726773098.94521: _low_level_execute_command() done: rc=0, stdout=, stderr= 11274 1726773098.94525: _low_level_execute_command(): starting 11274 1726773098.94530: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/AnsiballZ_file.py && sleep 0' 11274 1726773099.10771: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpqcxvidkm", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11274 1726773099.11916: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 11274 1726773099.11966: stderr chunk (state=3): >>><<< 11274 1726773099.11974: stdout chunk (state=3): >>><<< 11274 1726773099.11993: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmpqcxvidkm", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11274 1726773099.12022: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmpqcxvidkm', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11274 1726773099.12033: _low_level_execute_command(): starting 11274 1726773099.12038: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773098.6182492-11274-255831349399853/ > /dev/null 2>&1 && sleep 0' 11274 1726773099.14474: stderr chunk (state=2): >>><<< 11274 1726773099.14483: stdout chunk (state=2): >>><<< 11274 1726773099.14502: _low_level_execute_command() done: rc=0, stdout=, stderr= 11274 1726773099.14513: handler run complete 11274 1726773099.14534: attempt loop complete, returning result 11274 1726773099.14538: _execute() done 11274 1726773099.14541: dumping result to json 11274 1726773099.14547: done dumping result, returning 11274 1726773099.14554: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-885f-bbcf-0000000009fe] 11274 1726773099.14559: sending task result for task 0affffe7-6841-885f-bbcf-0000000009fe 11274 1726773099.14594: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009fe 11274 1726773099.14598: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8240 1726773099.14764: no more pending results, returning what we have 8240 1726773099.14767: results queue empty 8240 1726773099.14768: checking for any_errors_fatal 8240 1726773099.14777: done checking for any_errors_fatal 8240 1726773099.14778: 
checking for max_fail_percentage 8240 1726773099.14779: done checking for max_fail_percentage 8240 1726773099.14780: checking to see if all hosts have failed and the running result is not ok 8240 1726773099.14781: done checking to see if all hosts have failed 8240 1726773099.14782: getting the remaining hosts for this loop 8240 1726773099.14783: done getting the remaining hosts for this loop 8240 1726773099.14788: getting the next task for host managed_node2 8240 1726773099.14795: done getting next task for host managed_node2 8240 1726773099.14798: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8240 1726773099.14800: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773099.14810: getting variables 8240 1726773099.14812: in VariableManager get_vars() 8240 1726773099.14845: Calling all_inventory to load vars for managed_node2 8240 1726773099.14848: Calling groups_inventory to load vars for managed_node2 8240 1726773099.14850: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773099.14860: Calling all_plugins_play to load vars for managed_node2 8240 1726773099.14862: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773099.14865: Calling groups_plugins_play to load vars for managed_node2 8240 1726773099.14975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773099.15095: done with get_vars() 8240 1726773099.15103: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:39 -0400 (0:00:00.572) 0:01:17.795 **** 8240 1726773099.15162: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773099.15324: worker is 1 (out of 1 available) 8240 1726773099.15339: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773099.15354: done queuing things up, now waiting for results queue to drain 8240 1726773099.15355: waiting for pending results... 
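
Editor's note: the "Set profile_mode to manual" task above ran ansible.legacy.stat followed by ansible.legacy.file and came back ok (changed=false): /etc/tuned/profile_mode already held 7 bytes with mode 0600 and the expected checksum. The role's own task text is not part of this log, but a copy task of roughly the following shape would produce the same end state (the literal content "manual\n" is an assumption inferred from the task name and the reported 7-byte size; it is never echoed in the output):

# Sketch only -- not the role's verbatim task; the content value is inferred, see note above.
- name: Set profile_mode to manual
  ansible.builtin.copy:
    content: "manual\n"        # assumed 7-byte payload matching the size in the result
    dest: /etc/tuned/profile_mode
    mode: "0600"
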
11286 1726773099.15489: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 11286 1726773099.15610: in run() - task 0affffe7-6841-885f-bbcf-0000000009ff 11286 1726773099.15626: variable 'ansible_search_path' from source: unknown 11286 1726773099.15630: variable 'ansible_search_path' from source: unknown 11286 1726773099.15657: calling self._execute() 11286 1726773099.15729: variable 'ansible_host' from source: host vars for 'managed_node2' 11286 1726773099.15737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11286 1726773099.15747: variable 'omit' from source: magic vars 11286 1726773099.15825: variable 'omit' from source: magic vars 11286 1726773099.15861: variable 'omit' from source: magic vars 11286 1726773099.15882: variable '__kernel_settings_profile_filename' from source: role '' all vars 11286 1726773099.16103: variable '__kernel_settings_profile_filename' from source: role '' all vars 11286 1726773099.16163: variable '__kernel_settings_profile_dir' from source: role '' all vars 11286 1726773099.16228: variable '__kernel_settings_profile_parent' from source: set_fact 11286 1726773099.16238: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11286 1726773099.16335: variable 'omit' from source: magic vars 11286 1726773099.16366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11286 1726773099.16394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11286 1726773099.16414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11286 1726773099.16429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11286 1726773099.16440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11286 1726773099.16465: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11286 1726773099.16470: variable 'ansible_host' from source: host vars for 'managed_node2' 11286 1726773099.16475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11286 1726773099.16553: Set connection var ansible_pipelining to False 11286 1726773099.16562: Set connection var ansible_timeout to 10 11286 1726773099.16569: Set connection var ansible_module_compression to ZIP_DEFLATED 11286 1726773099.16572: Set connection var ansible_shell_type to sh 11286 1726773099.16578: Set connection var ansible_shell_executable to /bin/sh 11286 1726773099.16582: Set connection var ansible_connection to ssh 11286 1726773099.16622: variable 'ansible_shell_executable' from source: unknown 11286 1726773099.16628: variable 'ansible_connection' from source: unknown 11286 1726773099.16631: variable 'ansible_module_compression' from source: unknown 11286 1726773099.16634: variable 'ansible_shell_type' from source: unknown 11286 1726773099.16636: variable 'ansible_shell_executable' from source: unknown 11286 1726773099.16639: variable 'ansible_host' from source: host vars for 'managed_node2' 11286 1726773099.16642: variable 'ansible_pipelining' from source: unknown 11286 1726773099.16644: variable 'ansible_timeout' from source: unknown 11286 1726773099.16647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11286 1726773099.16817: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11286 1726773099.16829: variable 'omit' from source: magic vars 11286 1726773099.16835: starting attempt loop 11286 1726773099.16838: running the handler 11286 1726773099.16852: _low_level_execute_command(): starting 11286 1726773099.16859: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11286 1726773099.19390: stdout chunk (state=2): >>>/root <<< 11286 1726773099.19510: stderr chunk (state=3): >>><<< 11286 1726773099.19517: stdout chunk (state=3): >>><<< 11286 1726773099.19534: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11286 1726773099.19546: _low_level_execute_command(): starting 11286 1726773099.19552: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951 `" && echo ansible-tmp-1726773099.1954145-11286-190239000203951="` echo /root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951 `" ) && sleep 0' 11286 1726773099.22115: stdout chunk (state=2): >>>ansible-tmp-1726773099.1954145-11286-190239000203951=/root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951 <<< 11286 1726773099.22244: stderr chunk (state=3): >>><<< 11286 1726773099.22251: stdout chunk (state=3): >>><<< 11286 1726773099.22267: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773099.1954145-11286-190239000203951=/root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951 , stderr= 11286 1726773099.22308: variable 'ansible_module_compression' from source: unknown 11286 1726773099.22340: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 11286 1726773099.22375: variable 'ansible_facts' from source: unknown 11286 1726773099.22442: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951/AnsiballZ_kernel_settings_get_config.py 11286 1726773099.22542: Sending initial data 11286 1726773099.22549: Sent initial data (174 bytes) 11286 1726773099.25051: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpsepn5fb6 /root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951/AnsiballZ_kernel_settings_get_config.py <<< 11286 1726773099.26128: stderr chunk (state=3): >>><<< 11286 1726773099.26136: stdout chunk (state=3): >>><<< 11286 1726773099.26156: done transferring module to remote 11286 1726773099.26167: _low_level_execute_command(): starting 11286 1726773099.26172: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951/ /root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11286 1726773099.28528: stderr chunk (state=2): >>><<< 11286 1726773099.28537: stdout chunk (state=2): >>><<< 11286 1726773099.28552: _low_level_execute_command() done: rc=0, stdout=, stderr= 11286 1726773099.28556: _low_level_execute_command(): starting 11286 1726773099.28561: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11286 1726773099.44191: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 11286 1726773099.45251: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11286 1726773099.45302: stderr chunk (state=3): >>><<< 11286 1726773099.45309: stdout chunk (state=3): >>><<< 11286 1726773099.45327: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "sysctl": {"fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": "65530"}, "sysfs": {"/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 11286 1726773099.45354: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11286 1726773099.45365: _low_level_execute_command(): starting 11286 1726773099.45371: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773099.1954145-11286-190239000203951/ > /dev/null 2>&1 && sleep 0' 11286 1726773099.47811: stderr chunk (state=2): >>><<< 11286 1726773099.47819: stdout chunk (state=2): >>><<< 11286 1726773099.47834: _low_level_execute_command() done: rc=0, stdout=, stderr= 11286 1726773099.47841: handler run complete 11286 1726773099.47857: attempt loop complete, returning result 11286 1726773099.47861: _execute() done 11286 1726773099.47864: dumping result to json 11286 1726773099.47869: done dumping result, returning 11286 1726773099.47877: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-885f-bbcf-0000000009ff] 11286 1726773099.47882: sending task result for task 0affffe7-6841-885f-bbcf-0000000009ff 11286 1726773099.47917: done sending task result for task 0affffe7-6841-885f-bbcf-0000000009ff 11286 1726773099.47921: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "sysctl": { "fs.epoll.max_user_watches": "785592", "fs.file-max": "400001", "vm.max_map_count": 
"65530" }, "sysfs": { "/sys/class/net/lo/mtu": "60666", "/sys/fs/selinux/avc/cache_threshold": "256", "/sys/kernel/debug/x86/ibrs_enabled": "0", "/sys/kernel/debug/x86/pti_enabled": "0", "/sys/kernel/debug/x86/retp_enabled": "0" } } } 8240 1726773099.48081: no more pending results, returning what we have 8240 1726773099.48087: results queue empty 8240 1726773099.48088: checking for any_errors_fatal 8240 1726773099.48095: done checking for any_errors_fatal 8240 1726773099.48095: checking for max_fail_percentage 8240 1726773099.48097: done checking for max_fail_percentage 8240 1726773099.48098: checking to see if all hosts have failed and the running result is not ok 8240 1726773099.48099: done checking to see if all hosts have failed 8240 1726773099.48099: getting the remaining hosts for this loop 8240 1726773099.48102: done getting the remaining hosts for this loop 8240 1726773099.48106: getting the next task for host managed_node2 8240 1726773099.48113: done getting next task for host managed_node2 8240 1726773099.48116: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8240 1726773099.48118: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773099.48129: getting variables 8240 1726773099.48130: in VariableManager get_vars() 8240 1726773099.48164: Calling all_inventory to load vars for managed_node2 8240 1726773099.48166: Calling groups_inventory to load vars for managed_node2 8240 1726773099.48168: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773099.48178: Calling all_plugins_play to load vars for managed_node2 8240 1726773099.48180: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773099.48182: Calling groups_plugins_play to load vars for managed_node2 8240 1726773099.48337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773099.48459: done with get_vars() 8240 1726773099.48466: done getting variables 8240 1726773099.48514: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:39 -0400 (0:00:00.333) 0:01:18.129 **** 8240 1726773099.48537: entering _queue_task() for managed_node2/template 8240 1726773099.48706: worker is 1 (out of 1 available) 8240 1726773099.48721: exiting _queue_task() for managed_node2/template 8240 1726773099.48736: done queuing things up, now waiting for results queue to drain 8240 1726773099.48737: waiting for pending results... 
11297 1726773099.48864: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 11297 1726773099.48979: in run() - task 0affffe7-6841-885f-bbcf-000000000a00 11297 1726773099.48997: variable 'ansible_search_path' from source: unknown 11297 1726773099.49001: variable 'ansible_search_path' from source: unknown 11297 1726773099.49029: calling self._execute() 11297 1726773099.49105: variable 'ansible_host' from source: host vars for 'managed_node2' 11297 1726773099.49114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11297 1726773099.49123: variable 'omit' from source: magic vars 11297 1726773099.49205: variable 'omit' from source: magic vars 11297 1726773099.49241: variable 'omit' from source: magic vars 11297 1726773099.49479: variable '__kernel_settings_profile_src' from source: role '' all vars 11297 1726773099.49490: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11297 1726773099.49546: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11297 1726773099.49567: variable '__kernel_settings_profile_filename' from source: role '' all vars 11297 1726773099.49616: variable '__kernel_settings_profile_filename' from source: role '' all vars 11297 1726773099.49665: variable '__kernel_settings_profile_dir' from source: role '' all vars 11297 1726773099.49731: variable '__kernel_settings_profile_parent' from source: set_fact 11297 1726773099.49739: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11297 1726773099.49764: variable 'omit' from source: magic vars 11297 1726773099.49801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11297 1726773099.49829: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11297 1726773099.49848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11297 1726773099.49862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11297 1726773099.49873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11297 1726773099.49899: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11297 1726773099.49906: variable 'ansible_host' from source: host vars for 'managed_node2' 11297 1726773099.49911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11297 1726773099.49981: Set connection var ansible_pipelining to False 11297 1726773099.49989: Set connection var ansible_timeout to 10 11297 1726773099.49997: Set connection var ansible_module_compression to ZIP_DEFLATED 11297 1726773099.50000: Set connection var ansible_shell_type to sh 11297 1726773099.50006: Set connection var ansible_shell_executable to /bin/sh 11297 1726773099.50011: Set connection var ansible_connection to ssh 11297 1726773099.50028: variable 'ansible_shell_executable' from source: unknown 11297 1726773099.50032: variable 'ansible_connection' from source: unknown 11297 1726773099.50036: variable 'ansible_module_compression' from source: unknown 11297 1726773099.50039: variable 'ansible_shell_type' from source: unknown 11297 1726773099.50042: variable 'ansible_shell_executable' from source: unknown 11297 1726773099.50044: variable 'ansible_host' from source: host vars for 'managed_node2' 11297 1726773099.50047: 
variable 'ansible_pipelining' from source: unknown 11297 1726773099.50048: variable 'ansible_timeout' from source: unknown 11297 1726773099.50050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11297 1726773099.50144: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11297 1726773099.50157: variable 'omit' from source: magic vars 11297 1726773099.50164: starting attempt loop 11297 1726773099.50167: running the handler 11297 1726773099.50178: _low_level_execute_command(): starting 11297 1726773099.50187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11297 1726773099.52505: stdout chunk (state=2): >>>/root <<< 11297 1726773099.52629: stderr chunk (state=3): >>><<< 11297 1726773099.52638: stdout chunk (state=3): >>><<< 11297 1726773099.52657: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11297 1726773099.52671: _low_level_execute_command(): starting 11297 1726773099.52677: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947 `" && echo ansible-tmp-1726773099.5266585-11297-190258746927947="` echo /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947 `" ) && sleep 0' 11297 1726773099.55253: stdout chunk (state=2): >>>ansible-tmp-1726773099.5266585-11297-190258746927947=/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947 <<< 11297 1726773099.55388: stderr chunk (state=3): >>><<< 11297 1726773099.55396: stdout chunk (state=3): >>><<< 11297 1726773099.55413: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773099.5266585-11297-190258746927947=/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947 , stderr= 11297 1726773099.55430: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 11297 1726773099.55448: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 11297 1726773099.55469: variable 'ansible_search_path' from source: unknown 11297 1726773099.56055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11297 1726773099.57482: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11297 1726773099.57537: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11297 1726773099.57566: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11297 1726773099.57595: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11297 1726773099.57618: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11297 1726773099.57803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11297 1726773099.57824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11297 1726773099.57845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11297 1726773099.57872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11297 1726773099.57884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11297 1726773099.58113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11297 1726773099.58132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11297 1726773099.58150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11297 1726773099.58176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11297 1726773099.58190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11297 1726773099.58435: variable 'ansible_managed' from source: unknown 11297 1726773099.58443: variable '__sections' from source: task vars 11297 1726773099.58530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11297 1726773099.58549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11297 1726773099.58567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11297 1726773099.58595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11297 1726773099.58609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11297 1726773099.58677: variable 'kernel_settings_sysctl' from source: include params 11297 1726773099.58688: variable '__kernel_settings_state_empty' from source: role '' all vars 11297 1726773099.58693: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11297 1726773099.58723: variable '__sysctl_old' from source: task vars 11297 1726773099.58767: variable '__sysctl_old' from source: task vars 11297 1726773099.58909: variable 'kernel_settings_purge' from source: role '' defaults 11297 1726773099.58916: variable 'kernel_settings_sysctl' from source: include params 11297 1726773099.58923: variable '__kernel_settings_state_empty' from source: role '' all vars 11297 1726773099.58927: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11297 1726773099.58932: variable '__kernel_settings_profile_contents' from source: set_fact 11297 1726773099.59058: variable 'kernel_settings_sysfs' from source: include params 11297 1726773099.59066: variable '__kernel_settings_state_empty' from source: role '' all vars 11297 1726773099.59072: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11297 1726773099.59083: variable '__sysfs_old' from source: task vars 11297 1726773099.59129: variable '__sysfs_old' from source: task vars 11297 1726773099.59262: variable 'kernel_settings_purge' from source: role '' defaults 11297 1726773099.59269: variable 'kernel_settings_sysfs' from source: include params 11297 1726773099.59275: variable '__kernel_settings_state_empty' from source: role '' all vars 11297 1726773099.59280: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11297 1726773099.59286: variable '__kernel_settings_profile_contents' from source: set_fact 11297 1726773099.59304: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 11297 1726773099.59313: variable '__systemd_old' from source: task vars 11297 1726773099.59353: variable '__systemd_old' from source: task vars 11297 1726773099.59487: variable 'kernel_settings_purge' from source: role '' defaults 11297 1726773099.59493: variable 'kernel_settings_systemd_cpu_affinity' from source: role '' defaults 11297 1726773099.59498: variable '__kernel_settings_state_absent' from source: role '' all vars 11297 1726773099.59506: variable '__kernel_settings_profile_contents' from source: set_fact 11297 1726773099.59522: variable 'kernel_settings_transparent_hugepages' from source: include params 11297 1726773099.59563: variable 'kernel_settings_transparent_hugepages' from source: include params 11297 1726773099.59573: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 11297 1726773099.59579: variable '__trans_huge_old' from source: task vars 11297 1726773099.59621: variable '__trans_huge_old' from source: task vars 11297 1726773099.59748: variable 'kernel_settings_purge' from source: role '' defaults 11297 1726773099.59755: variable 'kernel_settings_transparent_hugepages' from source: include params 11297 1726773099.59760: variable '__kernel_settings_state_absent' from source: role '' all vars 11297 1726773099.59766: variable '__kernel_settings_profile_contents' from 
source: set_fact 11297 1726773099.59777: variable '__trans_defrag_old' from source: task vars 11297 1726773099.59820: variable '__trans_defrag_old' from source: task vars 11297 1726773099.59946: variable 'kernel_settings_purge' from source: role '' defaults 11297 1726773099.59953: variable 'kernel_settings_transparent_hugepages_defrag' from source: role '' defaults 11297 1726773099.59958: variable '__kernel_settings_state_absent' from source: role '' all vars 11297 1726773099.59963: variable '__kernel_settings_profile_contents' from source: set_fact 11297 1726773099.59981: variable '__kernel_settings_state_absent' from source: role '' all vars 11297 1726773099.59993: variable '__kernel_settings_state_absent' from source: role '' all vars 11297 1726773099.60004: variable '__kernel_settings_state_absent' from source: role '' all vars 11297 1726773099.60018: variable '__kernel_settings_state_absent' from source: role '' all vars 11297 1726773099.60458: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11297 1726773099.60517: variable 'ansible_module_compression' from source: unknown 11297 1726773099.60558: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11297 1726773099.60593: variable 'ansible_facts' from source: unknown 11297 1726773099.60661: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/AnsiballZ_stat.py 11297 1726773099.60755: Sending initial data 11297 1726773099.60762: Sent initial data (152 bytes) 11297 1726773099.63357: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpjgznljro /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/AnsiballZ_stat.py <<< 11297 1726773099.64469: stderr chunk (state=3): >>><<< 11297 1726773099.64478: stdout chunk (state=3): >>><<< 11297 1726773099.64503: done transferring module to remote 11297 1726773099.64515: _low_level_execute_command(): starting 11297 1726773099.64520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/ /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/AnsiballZ_stat.py && sleep 0' 11297 1726773099.66905: stderr chunk (state=2): >>><<< 11297 1726773099.66914: stdout chunk (state=2): >>><<< 11297 1726773099.66930: _low_level_execute_command() done: rc=0, stdout=, stderr= 11297 1726773099.66935: _low_level_execute_command(): starting 11297 1726773099.66940: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/AnsiballZ_stat.py && sleep 0' 11297 1726773099.83145: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 155189476, "dev": 51713, "nlink": 1, "atime": 1726773090.1110318, "mtime": 1726773089.3130238, "ctime": 1726773089.5560262, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, 
"isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "mimetype": "text/plain", "charset": "us-ascii", "version": "2237378470", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11297 1726773099.84356: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11297 1726773099.84410: stderr chunk (state=3): >>><<< 11297 1726773099.84417: stdout chunk (state=3): >>><<< 11297 1726773099.84433: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 372, "inode": 155189476, "dev": 51713, "nlink": 1, "atime": 1726773090.1110318, "mtime": 1726773089.3130238, "ctime": 1726773089.5560262, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3107bf46f5c007ef178305bb243dd11664f9bf35", "mimetype": "text/plain", "charset": "us-ascii", "version": "2237378470", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
11297 1726773099.84469: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11297 1726773099.84557: Sending initial data 11297 1726773099.84564: Sent initial data (160 bytes) 11297 1726773099.87124: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpca5pt2kv/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/source <<< 11297 1726773099.87494: stderr chunk (state=3): >>><<< 11297 1726773099.87505: stdout chunk (state=3): >>><<< 11297 1726773099.87521: _low_level_execute_command(): starting 11297 1726773099.87527: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/ /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/source && sleep 0' 11297 1726773099.89852: stderr chunk (state=2): >>><<< 11297 1726773099.89861: stdout chunk (state=2): >>><<< 11297 1726773099.89876: _low_level_execute_command() done: rc=0, stdout=, stderr= 11297 1726773099.89898: variable 'ansible_module_compression' from source: unknown 11297 1726773099.89936: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11297 1726773099.89954: variable 'ansible_facts' from source: unknown 11297 1726773099.90013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/AnsiballZ_copy.py 11297 1726773099.90100: Sending initial data 11297 1726773099.90110: Sent initial data (152 bytes) 11297 1726773099.92602: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmplwpwmyrx /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/AnsiballZ_copy.py <<< 11297 1726773099.93716: stderr chunk (state=3): >>><<< 11297 1726773099.93724: stdout chunk (state=3): >>><<< 11297 1726773099.93743: done transferring module to remote 11297 1726773099.93751: _low_level_execute_command(): starting 11297 1726773099.93756: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/ /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/AnsiballZ_copy.py && sleep 0' 11297 1726773099.96102: stderr chunk (state=2): >>><<< 11297 1726773099.96112: stdout chunk (state=2): >>><<< 11297 1726773099.96126: _low_level_execute_command() done: rc=0, stdout=, stderr= 11297 1726773099.96130: _low_level_execute_command(): starting 11297 1726773099.96136: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/AnsiballZ_copy.py && sleep 0' 11297 1726773100.12493: stdout chunk (state=2): >>> {"dest": 
"/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/source", "md5sum": "394928e588644c456053f3dec5f7c2ba", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 121, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11297 1726773100.13645: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11297 1726773100.13695: stderr chunk (state=3): >>><<< 11297 1726773100.13701: stdout chunk (state=3): >>><<< 11297 1726773100.13718: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/source", "md5sum": "394928e588644c456053f3dec5f7c2ba", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 121, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11297 1726773100.13743: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': '0b586509c0bdce12a2dde058e3374dab88cf7f2c', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11297 1726773100.13771: _low_level_execute_command(): starting 11297 1726773100.13778: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/ > /dev/null 2>&1 && sleep 0' 11297 1726773100.16194: stderr chunk (state=2): >>><<< 11297 1726773100.16203: stdout chunk (state=2): >>><<< 11297 1726773100.16218: _low_level_execute_command() done: rc=0, stdout=, stderr= 11297 1726773100.16228: handler run complete 11297 1726773100.16248: attempt loop complete, returning result 11297 1726773100.16251: _execute() done 11297 1726773100.16254: dumping result to json 11297 1726773100.16260: done dumping result, returning 11297 1726773100.16268: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-885f-bbcf-000000000a00] 11297 1726773100.16273: sending task result for task 0affffe7-6841-885f-bbcf-000000000a00 11297 1726773100.16321: done sending task result for task 0affffe7-6841-885f-bbcf-000000000a00 11297 1726773100.16325: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "394928e588644c456053f3dec5f7c2ba", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 121, "src": "/root/.ansible/tmp/ansible-tmp-1726773099.5266585-11297-190258746927947/source", "state": "file", "uid": 0 } 8240 1726773100.16572: no more pending results, returning what we have 8240 1726773100.16576: results queue empty 8240 1726773100.16577: checking for any_errors_fatal 8240 1726773100.16583: done checking for any_errors_fatal 8240 1726773100.16584: checking for max_fail_percentage 8240 1726773100.16586: done checking for max_fail_percentage 8240 1726773100.16587: checking to see if all hosts have failed and the running result is not ok 8240 1726773100.16588: done checking to see if all hosts have failed 8240 1726773100.16588: getting the remaining hosts for this loop 8240 1726773100.16589: done getting the remaining hosts for this loop 8240 1726773100.16592: getting the next task for host managed_node2 8240 1726773100.16597: done getting next task for host managed_node2 8240 1726773100.16599: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8240 1726773100.16603: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773100.16611: getting variables 8240 1726773100.16612: in VariableManager get_vars() 8240 1726773100.16638: Calling all_inventory to load vars for managed_node2 8240 1726773100.16640: Calling groups_inventory to load vars for managed_node2 8240 1726773100.16641: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773100.16649: Calling all_plugins_play to load vars for managed_node2 8240 1726773100.16651: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773100.16652: Calling groups_plugins_play to load vars for managed_node2 8240 1726773100.16763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773100.16884: done with get_vars() 8240 1726773100.16894: done getting variables 8240 1726773100.16939: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:40 -0400 (0:00:00.684) 0:01:18.813 **** 8240 1726773100.16962: entering _queue_task() for managed_node2/service 8240 1726773100.17139: worker is 1 (out of 1 available) 8240 1726773100.17155: exiting _queue_task() for managed_node2/service 8240 1726773100.17169: done queuing things up, now waiting for results queue to drain 8240 1726773100.17171: waiting for pending results... 
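
Editor's note: the task queued above restarts tuned only if something actually changed; in the worker output that follows, the conditional "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed" evaluates to False, so the tuned item is skipped. The log does not show the role's task text, but given the service action plugin, the item loop over __kernel_settings_services and that exact condition, a plausible approximation is:

# Approximation assembled from the loop variable, condition and service list visible in the log; not the verbatim role task.
- name: Restart tuned to apply active profile, mode changes
  ansible.builtin.service:
    name: "{{ item }}"
    state: restarted
  loop: "{{ __kernel_settings_services }}"
  when: __kernel_settings_register_profile is changed or
        __kernel_settings_register_mode is changed
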
11312 1726773100.17299: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 11312 1726773100.17412: in run() - task 0affffe7-6841-885f-bbcf-000000000a01 11312 1726773100.17428: variable 'ansible_search_path' from source: unknown 11312 1726773100.17432: variable 'ansible_search_path' from source: unknown 11312 1726773100.17466: variable '__kernel_settings_services' from source: include_vars 11312 1726773100.17763: variable '__kernel_settings_services' from source: include_vars 11312 1726773100.17931: variable 'omit' from source: magic vars 11312 1726773100.18037: variable 'ansible_host' from source: host vars for 'managed_node2' 11312 1726773100.18048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11312 1726773100.18058: variable 'omit' from source: magic vars 11312 1726773100.18327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11312 1726773100.18558: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11312 1726773100.18606: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11312 1726773100.18638: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11312 1726773100.18661: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11312 1726773100.18739: variable '__kernel_settings_register_profile' from source: set_fact 11312 1726773100.18750: variable '__kernel_settings_register_mode' from source: set_fact 11312 1726773100.18766: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 11312 1726773100.18769: when evaluation is False, skipping this task 11312 1726773100.18794: variable 'item' from source: unknown 11312 1726773100.18839: variable 'item' from source: unknown skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 11312 1726773100.18865: dumping result to json 11312 1726773100.18869: done dumping result, returning 11312 1726773100.18873: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-885f-bbcf-000000000a01] 11312 1726773100.18877: sending task result for task 0affffe7-6841-885f-bbcf-000000000a01 11312 1726773100.18897: done sending task result for task 0affffe7-6841-885f-bbcf-000000000a01 11312 1726773100.18900: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 8240 1726773100.19112: no more pending results, returning what we have 8240 1726773100.19114: results queue empty 8240 1726773100.19115: checking for any_errors_fatal 8240 1726773100.19124: done checking for any_errors_fatal 8240 1726773100.19124: checking for max_fail_percentage 8240 1726773100.19125: done checking for max_fail_percentage 8240 1726773100.19126: checking to see if all hosts have failed and the running result is not ok 8240 1726773100.19127: done checking to see if all hosts have failed 8240 1726773100.19127: getting the remaining hosts for this loop 8240 1726773100.19128: done getting the remaining 
hosts for this loop 8240 1726773100.19130: getting the next task for host managed_node2 8240 1726773100.19135: done getting next task for host managed_node2 8240 1726773100.19137: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8240 1726773100.19139: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773100.19150: getting variables 8240 1726773100.19151: in VariableManager get_vars() 8240 1726773100.19177: Calling all_inventory to load vars for managed_node2 8240 1726773100.19179: Calling groups_inventory to load vars for managed_node2 8240 1726773100.19180: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773100.19189: Calling all_plugins_play to load vars for managed_node2 8240 1726773100.19191: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773100.19193: Calling groups_plugins_play to load vars for managed_node2 8240 1726773100.19298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773100.19418: done with get_vars() 8240 1726773100.19426: done getting variables 8240 1726773100.19467: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:40 -0400 (0:00:00.025) 0:01:18.838 **** 8240 1726773100.19492: entering _queue_task() for managed_node2/command 8240 1726773100.19654: worker is 1 (out of 1 available) 8240 1726773100.19668: exiting _queue_task() for managed_node2/command 8240 1726773100.19681: done queuing things up, now waiting for results queue to drain 8240 1726773100.19682: waiting for pending results... 
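The "Tuned apply settings" task executed next is, in effect, a command task guarded by three conditionals; a minimal sketch assuming the variable names shown in the trace:

- name: Tuned apply settings
  ansible.builtin.command: tuned-adm profile '{{ __kernel_settings_active_profile }}'
  when:
    - not __kernel_settings_register_profile is changed
    - not __kernel_settings_register_mode is changed
    - __kernel_settings_register_apply is changed

Here __kernel_settings_active_profile expands to "virtual-guest kernel_settings"; tuned-adm accepts several profile names at once and merges them, so the role's kernel_settings profile is applied on top of the already active virtual-guest profile.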
11316 1726773100.19811: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 11316 1726773100.19920: in run() - task 0affffe7-6841-885f-bbcf-000000000a02 11316 1726773100.19937: variable 'ansible_search_path' from source: unknown 11316 1726773100.19941: variable 'ansible_search_path' from source: unknown 11316 1726773100.19968: calling self._execute() 11316 1726773100.20039: variable 'ansible_host' from source: host vars for 'managed_node2' 11316 1726773100.20048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11316 1726773100.20056: variable 'omit' from source: magic vars 11316 1726773100.20384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11316 1726773100.20652: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11316 1726773100.20688: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11316 1726773100.20714: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11316 1726773100.20738: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11316 1726773100.20820: variable '__kernel_settings_register_profile' from source: set_fact 11316 1726773100.20844: Evaluated conditional (not __kernel_settings_register_profile is changed): True 11316 1726773100.20934: variable '__kernel_settings_register_mode' from source: set_fact 11316 1726773100.20945: Evaluated conditional (not __kernel_settings_register_mode is changed): True 11316 1726773100.21021: variable '__kernel_settings_register_apply' from source: set_fact 11316 1726773100.21032: Evaluated conditional (__kernel_settings_register_apply is changed): True 11316 1726773100.21039: variable 'omit' from source: magic vars 11316 1726773100.21068: variable 'omit' from source: magic vars 11316 1726773100.21151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11316 1726773100.22536: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11316 1726773100.22589: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11316 1726773100.22617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11316 1726773100.22643: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11316 1726773100.22662: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11316 1726773100.22717: variable '__kernel_settings_active_profile' from source: set_fact 11316 1726773100.22744: variable 'omit' from source: magic vars 11316 1726773100.22766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11316 1726773100.22790: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11316 1726773100.22808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11316 1726773100.22823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11316 1726773100.22833: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11316 1726773100.22857: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11316 1726773100.22862: variable 'ansible_host' from source: host vars for 'managed_node2' 11316 1726773100.22866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11316 1726773100.22934: Set connection var ansible_pipelining to False 11316 1726773100.22941: Set connection var ansible_timeout to 10 11316 1726773100.22949: Set connection var ansible_module_compression to ZIP_DEFLATED 11316 1726773100.22952: Set connection var ansible_shell_type to sh 11316 1726773100.22957: Set connection var ansible_shell_executable to /bin/sh 11316 1726773100.22962: Set connection var ansible_connection to ssh 11316 1726773100.22979: variable 'ansible_shell_executable' from source: unknown 11316 1726773100.22983: variable 'ansible_connection' from source: unknown 11316 1726773100.22987: variable 'ansible_module_compression' from source: unknown 11316 1726773100.22991: variable 'ansible_shell_type' from source: unknown 11316 1726773100.22994: variable 'ansible_shell_executable' from source: unknown 11316 1726773100.22998: variable 'ansible_host' from source: host vars for 'managed_node2' 11316 1726773100.23002: variable 'ansible_pipelining' from source: unknown 11316 1726773100.23005: variable 'ansible_timeout' from source: unknown 11316 1726773100.23009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11316 1726773100.23073: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11316 1726773100.23083: variable 'omit' from source: magic vars 11316 1726773100.23091: starting attempt loop 11316 1726773100.23095: running the handler 11316 1726773100.23107: _low_level_execute_command(): starting 11316 1726773100.23113: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11316 1726773100.25425: stdout chunk (state=2): >>>/root <<< 11316 1726773100.25548: stderr chunk (state=3): >>><<< 11316 1726773100.25554: stdout chunk (state=3): >>><<< 11316 1726773100.25573: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11316 1726773100.25588: _low_level_execute_command(): starting 11316 1726773100.25594: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327 `" && echo ansible-tmp-1726773100.2558105-11316-83871178383327="` echo /root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327 `" ) && sleep 0' 11316 1726773100.28103: stdout chunk (state=2): >>>ansible-tmp-1726773100.2558105-11316-83871178383327=/root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327 <<< 11316 1726773100.28237: stderr chunk (state=3): >>><<< 11316 1726773100.28244: stdout chunk (state=3): >>><<< 11316 1726773100.28259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773100.2558105-11316-83871178383327=/root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327 , stderr= 11316 1726773100.28282: variable 'ansible_module_compression' from source: unknown 11316 1726773100.28324: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11316 1726773100.28352: variable 'ansible_facts' from source: unknown 11316 1726773100.28427: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327/AnsiballZ_command.py 11316 1726773100.28532: Sending initial data 11316 1726773100.28539: Sent initial data (154 bytes) 11316 1726773100.31058: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpebebv4vm /root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327/AnsiballZ_command.py <<< 11316 1726773100.32163: stderr chunk (state=3): >>><<< 11316 1726773100.32173: stdout chunk (state=3): >>><<< 11316 1726773100.32194: done transferring module to remote 11316 1726773100.32209: _low_level_execute_command(): starting 11316 1726773100.32215: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327/ /root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327/AnsiballZ_command.py && sleep 0' 11316 1726773100.34569: stderr chunk (state=2): >>><<< 11316 1726773100.34577: stdout chunk (state=2): >>><<< 11316 1726773100.34592: _low_level_execute_command() done: rc=0, stdout=, stderr= 11316 1726773100.34596: _low_level_execute_command(): starting 11316 1726773100.34601: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327/AnsiballZ_command.py && sleep 0' 11316 1726773101.63918: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:40.495603", "end": "2024-09-19 15:11:41.636956", "delta": "0:00:01.141353", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11316 1726773101.65118: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11316 1726773101.65165: stderr chunk (state=3): >>><<< 11316 1726773101.65174: stdout chunk (state=3): >>><<< 11316 1726773101.65191: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:40.495603", "end": "2024-09-19 15:11:41.636956", "delta": "0:00:01.141353", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11316 1726773101.65219: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11316 1726773101.65229: _low_level_execute_command(): starting 11316 1726773101.65235: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773100.2558105-11316-83871178383327/ > /dev/null 2>&1 && sleep 0' 11316 1726773101.67665: stderr chunk (state=2): >>><<< 11316 1726773101.67675: stdout chunk (state=2): >>><<< 11316 1726773101.67692: _low_level_execute_command() done: rc=0, stdout=, stderr= 11316 1726773101.67699: handler run complete 11316 1726773101.67717: Evaluated conditional (True): True 11316 1726773101.67726: attempt loop complete, returning result 11316 1726773101.67729: _execute() done 11316 1726773101.67732: dumping result to json 11316 1726773101.67738: done dumping result, returning 11316 1726773101.67746: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-885f-bbcf-000000000a02] 11316 1726773101.67751: sending task result for task 0affffe7-6841-885f-bbcf-000000000a02 11316 1726773101.67780: done sending task result for task 0affffe7-6841-885f-bbcf-000000000a02 11316 1726773101.67783: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.141353", "end": "2024-09-19 15:11:41.636956", "rc": 0, "start": "2024-09-19 15:11:40.495603" } 8240 1726773101.67958: no more pending results, returning what we have 8240 1726773101.67962: results queue empty 8240 1726773101.67963: checking for any_errors_fatal 8240 1726773101.67970: done checking for any_errors_fatal 8240 1726773101.67971: checking for max_fail_percentage 8240 1726773101.67973: done checking for max_fail_percentage 8240 1726773101.67974: checking to see if all hosts have failed and the running result is not ok 8240 1726773101.67975: done checking to see if all hosts have failed 8240 1726773101.67975: getting the remaining hosts for this loop 8240 1726773101.67977: done getting the remaining hosts for this loop 8240 1726773101.67980: getting the next task for host managed_node2 8240 1726773101.67988: done getting next task for host managed_node2 8240 1726773101.67992: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8240 1726773101.67994: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 8240 1726773101.68007: getting variables 8240 1726773101.68009: in VariableManager get_vars() 8240 1726773101.68043: Calling all_inventory to load vars for managed_node2 8240 1726773101.68046: Calling groups_inventory to load vars for managed_node2 8240 1726773101.68048: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773101.68057: Calling all_plugins_play to load vars for managed_node2 8240 1726773101.68059: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773101.68060: Calling groups_plugins_play to load vars for managed_node2 8240 1726773101.68242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773101.68355: done with get_vars() 8240 1726773101.68363: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:41 -0400 (0:00:01.489) 0:01:20.328 **** 8240 1726773101.68434: entering _queue_task() for managed_node2/include_tasks 8240 1726773101.68599: worker is 1 (out of 1 available) 8240 1726773101.68617: exiting _queue_task() for managed_node2/include_tasks 8240 1726773101.68630: done queuing things up, now waiting for results queue to drain 8240 1726773101.68632: waiting for pending results... 11327 1726773101.68761: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 11327 1726773101.68889: in run() - task 0affffe7-6841-885f-bbcf-000000000a03 11327 1726773101.68908: variable 'ansible_search_path' from source: unknown 11327 1726773101.68912: variable 'ansible_search_path' from source: unknown 11327 1726773101.68940: calling self._execute() 11327 1726773101.69011: variable 'ansible_host' from source: host vars for 'managed_node2' 11327 1726773101.69020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11327 1726773101.69029: variable 'omit' from source: magic vars 11327 1726773101.69356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11327 1726773101.69538: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11327 1726773101.69576: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11327 1726773101.69606: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11327 1726773101.69634: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11327 1726773101.69715: variable '__kernel_settings_register_apply' from source: set_fact 11327 1726773101.69739: Evaluated conditional (__kernel_settings_register_apply is changed): True 11327 1726773101.69746: _execute() done 11327 1726773101.69750: dumping result to json 11327 1726773101.69754: done dumping result, returning 11327 1726773101.69760: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-885f-bbcf-000000000a03] 11327 1726773101.69766: sending task result for task 0affffe7-6841-885f-bbcf-000000000a03 11327 1726773101.69790: done sending task result for task 0affffe7-6841-885f-bbcf-000000000a03 11327 1726773101.69794: WORKER PROCESS EXITING 8240 
1726773101.69905: no more pending results, returning what we have 8240 1726773101.69910: in VariableManager get_vars() 8240 1726773101.69949: Calling all_inventory to load vars for managed_node2 8240 1726773101.69952: Calling groups_inventory to load vars for managed_node2 8240 1726773101.69954: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773101.69964: Calling all_plugins_play to load vars for managed_node2 8240 1726773101.69966: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773101.69969: Calling groups_plugins_play to load vars for managed_node2 8240 1726773101.70094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773101.70211: done with get_vars() 8240 1726773101.70217: variable 'ansible_search_path' from source: unknown 8240 1726773101.70218: variable 'ansible_search_path' from source: unknown 8240 1726773101.70240: we have included files to process 8240 1726773101.70241: generating all_blocks data 8240 1726773101.70244: done generating all_blocks data 8240 1726773101.70248: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773101.70248: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773101.70249: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8240 1726773101.70517: done processing included file 8240 1726773101.70520: iterating over new_blocks loaded from include file 8240 1726773101.70520: in VariableManager get_vars() 8240 1726773101.70535: done with get_vars() 8240 1726773101.70536: filtering new block on tags 8240 1726773101.70567: done filtering new block on tags 8240 1726773101.70569: done iterating over new_blocks loaded from include file 8240 1726773101.70569: extending task lists for all hosts with included blocks 8240 1726773101.70975: done extending task lists 8240 1726773101.70976: done processing included files 8240 1726773101.70977: results queue empty 8240 1726773101.70977: checking for any_errors_fatal 8240 1726773101.70980: done checking for any_errors_fatal 8240 1726773101.70981: checking for max_fail_percentage 8240 1726773101.70981: done checking for max_fail_percentage 8240 1726773101.70982: checking to see if all hosts have failed and the running result is not ok 8240 1726773101.70982: done checking to see if all hosts have failed 8240 1726773101.70983: getting the remaining hosts for this loop 8240 1726773101.70983: done getting the remaining hosts for this loop 8240 1726773101.70987: getting the next task for host managed_node2 8240 1726773101.70991: done getting next task for host managed_node2 8240 1726773101.70992: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8240 1726773101.70994: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773101.71003: getting variables 8240 1726773101.71004: in VariableManager get_vars() 8240 1726773101.71013: Calling all_inventory to load vars for managed_node2 8240 1726773101.71014: Calling groups_inventory to load vars for managed_node2 8240 1726773101.71015: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773101.71019: Calling all_plugins_play to load vars for managed_node2 8240 1726773101.71020: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773101.71021: Calling groups_plugins_play to load vars for managed_node2 8240 1726773101.71098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773101.71210: done with get_vars() 8240 1726773101.71216: done getting variables 8240 1726773101.71241: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:41 -0400 (0:00:00.028) 0:01:20.356 **** 8240 1726773101.71263: entering _queue_task() for managed_node2/command 8240 1726773101.71434: worker is 1 (out of 1 available) 8240 1726773101.71447: exiting _queue_task() for managed_node2/command 8240 1726773101.71462: done queuing things up, now waiting for results queue to drain 8240 1726773101.71463: waiting for pending results... 
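The verification step traced below runs tuned-adm verify in ignore-missing mode; a sketch of the task, assuming the register name used by the later conditionals (changed_when: false is an inference from the fact that the raw module result reports changed while the final task result does not):

- name: Check that settings are applied correctly
  ansible.builtin.command: tuned-adm verify -i
  register: __kernel_settings_register_verify_values
  changed_when: false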
11328 1726773101.71589: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 11328 1726773101.71715: in run() - task 0affffe7-6841-885f-bbcf-000000000c42 11328 1726773101.71734: variable 'ansible_search_path' from source: unknown 11328 1726773101.71739: variable 'ansible_search_path' from source: unknown 11328 1726773101.71765: calling self._execute() 11328 1726773101.71831: variable 'ansible_host' from source: host vars for 'managed_node2' 11328 1726773101.71839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11328 1726773101.71848: variable 'omit' from source: magic vars 11328 1726773101.71922: variable 'omit' from source: magic vars 11328 1726773101.71969: variable 'omit' from source: magic vars 11328 1726773101.71994: variable 'omit' from source: magic vars 11328 1726773101.72029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11328 1726773101.72056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11328 1726773101.72074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11328 1726773101.72090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11328 1726773101.72101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11328 1726773101.72127: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11328 1726773101.72132: variable 'ansible_host' from source: host vars for 'managed_node2' 11328 1726773101.72136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11328 1726773101.72205: Set connection var ansible_pipelining to False 11328 1726773101.72212: Set connection var ansible_timeout to 10 11328 1726773101.72220: Set connection var ansible_module_compression to ZIP_DEFLATED 11328 1726773101.72224: Set connection var ansible_shell_type to sh 11328 1726773101.72229: Set connection var ansible_shell_executable to /bin/sh 11328 1726773101.72235: Set connection var ansible_connection to ssh 11328 1726773101.72251: variable 'ansible_shell_executable' from source: unknown 11328 1726773101.72255: variable 'ansible_connection' from source: unknown 11328 1726773101.72259: variable 'ansible_module_compression' from source: unknown 11328 1726773101.72262: variable 'ansible_shell_type' from source: unknown 11328 1726773101.72266: variable 'ansible_shell_executable' from source: unknown 11328 1726773101.72269: variable 'ansible_host' from source: host vars for 'managed_node2' 11328 1726773101.72273: variable 'ansible_pipelining' from source: unknown 11328 1726773101.72276: variable 'ansible_timeout' from source: unknown 11328 1726773101.72281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11328 1726773101.72375: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11328 1726773101.72388: variable 'omit' from source: magic vars 11328 1726773101.72394: starting attempt loop 11328 1726773101.72398: running the handler 11328 1726773101.72411: 
_low_level_execute_command(): starting 11328 1726773101.72419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11328 1726773101.74767: stdout chunk (state=2): >>>/root <<< 11328 1726773101.74888: stderr chunk (state=3): >>><<< 11328 1726773101.74895: stdout chunk (state=3): >>><<< 11328 1726773101.74916: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11328 1726773101.74928: _low_level_execute_command(): starting 11328 1726773101.74934: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340 `" && echo ansible-tmp-1726773101.7492347-11328-239763412778340="` echo /root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340 `" ) && sleep 0' 11328 1726773101.77693: stdout chunk (state=2): >>>ansible-tmp-1726773101.7492347-11328-239763412778340=/root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340 <<< 11328 1726773101.77824: stderr chunk (state=3): >>><<< 11328 1726773101.77831: stdout chunk (state=3): >>><<< 11328 1726773101.77846: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773101.7492347-11328-239763412778340=/root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340 , stderr= 11328 1726773101.77871: variable 'ansible_module_compression' from source: unknown 11328 1726773101.77921: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11328 1726773101.77951: variable 'ansible_facts' from source: unknown 11328 1726773101.78028: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340/AnsiballZ_command.py 11328 1726773101.78130: Sending initial data 11328 1726773101.78137: Sent initial data (155 bytes) 11328 1726773101.80663: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpo610r4ds /root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340/AnsiballZ_command.py <<< 11328 1726773101.81760: stderr chunk (state=3): >>><<< 11328 1726773101.81769: stdout chunk (state=3): >>><<< 11328 1726773101.81791: done transferring module to remote 11328 1726773101.81803: _low_level_execute_command(): starting 11328 1726773101.81809: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340/ /root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340/AnsiballZ_command.py && sleep 0' 11328 1726773101.84188: stderr chunk (state=2): >>><<< 11328 1726773101.84198: stdout chunk (state=2): >>><<< 11328 1726773101.84213: _low_level_execute_command() done: rc=0, stdout=, stderr= 11328 1726773101.84218: _low_level_execute_command(): starting 11328 1726773101.84223: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340/AnsiballZ_command.py && sleep 0' 11328 1726773102.10160: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:41.993284", "end": "2024-09-19 15:11:42.097036", "delta": "0:00:00.103752", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11328 1726773102.11137: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11328 1726773102.11188: stderr chunk (state=3): >>><<< 11328 1726773102.11194: stdout chunk (state=3): >>><<< 11328 1726773102.11211: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:41.993284", "end": "2024-09-19 15:11:42.097036", "delta": "0:00:00.103752", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11328 1726773102.11252: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11328 1726773102.11262: _low_level_execute_command(): starting 11328 1726773102.11270: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773101.7492347-11328-239763412778340/ > /dev/null 2>&1 && sleep 0' 11328 1726773102.13726: stderr chunk (state=2): >>><<< 11328 1726773102.13736: stdout chunk (state=2): >>><<< 11328 1726773102.13751: _low_level_execute_command() done: rc=0, stdout=, stderr= 11328 1726773102.13758: handler run complete 11328 1726773102.13777: Evaluated conditional (False): False 11328 1726773102.13788: attempt loop complete, returning result 11328 1726773102.13792: _execute() done 11328 1726773102.13795: dumping result to json 11328 1726773102.13801: done dumping result, returning 11328 1726773102.13809: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-885f-bbcf-000000000c42] 11328 1726773102.13815: sending task result for task 0affffe7-6841-885f-bbcf-000000000c42 11328 1726773102.13847: done sending task result for task 0affffe7-6841-885f-bbcf-000000000c42 11328 1726773102.13850: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.103752", "end": "2024-09-19 15:11:42.097036", "rc": 0, "start": "2024-09-19 15:11:41.993284" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 
8240 1726773102.14034: no more pending results, returning what we have 8240 1726773102.14037: results queue empty 8240 1726773102.14038: checking for any_errors_fatal 8240 1726773102.14040: done checking for any_errors_fatal 8240 1726773102.14041: checking for max_fail_percentage 8240 1726773102.14042: done checking for max_fail_percentage 8240 1726773102.14043: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.14044: done checking to see if all hosts have failed 8240 1726773102.14045: getting the remaining hosts for this loop 8240 1726773102.14046: done getting the remaining hosts for this loop 8240 1726773102.14049: getting the next task for host managed_node2 8240 1726773102.14056: done getting next task for host managed_node2 8240 1726773102.14059: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8240 1726773102.14062: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773102.14072: getting variables 8240 1726773102.14074: in VariableManager get_vars() 8240 1726773102.14108: Calling all_inventory to load vars for managed_node2 8240 1726773102.14110: Calling groups_inventory to load vars for managed_node2 8240 1726773102.14111: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.14119: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.14121: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.14123: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.14233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.14381: done with get_vars() 8240 1726773102.14390: done getting variables 8240 1726773102.14435: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.431) 0:01:20.788 **** 8240 1726773102.14459: entering _queue_task() for managed_node2/shell 8240 1726773102.14627: worker is 1 (out of 1 available) 8240 1726773102.14640: exiting _queue_task() for managed_node2/shell 8240 1726773102.14653: done queuing things up, now waiting for results queue to drain 8240 1726773102.14655: waiting for pending results... 
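The next task only matters when verification fails: the shell action loaded above pulls the most recent verification messages out of the TuneD log. A rough sketch, where only the guard condition and the log path appear in this trace and the pipeline and register name are purely illustrative:

- name: Get last verify results from log
  ansible.builtin.shell: grep -i verif /var/log/tuned/tuned.log | tail -n 20  # illustrative pipeline, not from the role
  register: __kernel_settings_verify_log  # hypothetical name for illustration
  when: __kernel_settings_register_verify_values is failed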
11336 1726773102.14780: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 11336 1726773102.14911: in run() - task 0affffe7-6841-885f-bbcf-000000000c43 11336 1726773102.14927: variable 'ansible_search_path' from source: unknown 11336 1726773102.14931: variable 'ansible_search_path' from source: unknown 11336 1726773102.14958: calling self._execute() 11336 1726773102.15027: variable 'ansible_host' from source: host vars for 'managed_node2' 11336 1726773102.15036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11336 1726773102.15044: variable 'omit' from source: magic vars 11336 1726773102.15376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11336 1726773102.15558: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11336 1726773102.15596: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11336 1726773102.15623: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11336 1726773102.15649: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11336 1726773102.15729: variable '__kernel_settings_register_verify_values' from source: set_fact 11336 1726773102.15754: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11336 1726773102.15759: when evaluation is False, skipping this task 11336 1726773102.15763: _execute() done 11336 1726773102.15767: dumping result to json 11336 1726773102.15771: done dumping result, returning 11336 1726773102.15777: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-885f-bbcf-000000000c43] 11336 1726773102.15782: sending task result for task 0affffe7-6841-885f-bbcf-000000000c43 11336 1726773102.15807: done sending task result for task 0affffe7-6841-885f-bbcf-000000000c43 11336 1726773102.15811: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773102.15916: no more pending results, returning what we have 8240 1726773102.15919: results queue empty 8240 1726773102.15920: checking for any_errors_fatal 8240 1726773102.15928: done checking for any_errors_fatal 8240 1726773102.15928: checking for max_fail_percentage 8240 1726773102.15930: done checking for max_fail_percentage 8240 1726773102.15930: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.15931: done checking to see if all hosts have failed 8240 1726773102.15932: getting the remaining hosts for this loop 8240 1726773102.15933: done getting the remaining hosts for this loop 8240 1726773102.15936: getting the next task for host managed_node2 8240 1726773102.15943: done getting next task for host managed_node2 8240 1726773102.15946: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8240 1726773102.15949: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773102.15965: getting variables 8240 1726773102.15967: in VariableManager get_vars() 8240 1726773102.16003: Calling all_inventory to load vars for managed_node2 8240 1726773102.16006: Calling groups_inventory to load vars for managed_node2 8240 1726773102.16008: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.16015: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.16017: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.16019: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.16124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.16244: done with get_vars() 8240 1726773102.16251: done getting variables 8240 1726773102.16294: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.018) 0:01:20.807 **** 8240 1726773102.16321: entering _queue_task() for managed_node2/fail 8240 1726773102.16474: worker is 1 (out of 1 available) 8240 1726773102.16488: exiting _queue_task() for managed_node2/fail 8240 1726773102.16504: done queuing things up, now waiting for results queue to drain 8240 1726773102.16506: waiting for pending results... 
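Its companion fail task reports the scraped errors; judging by its name it filters out bootloader-related messages before failing, but only the guard condition is visible in this trace:

- name: Report errors that are not bootloader errors
  ansible.builtin.fail:
    msg: "tuned-adm verify reported errors - see /var/log/tuned/tuned.log"  # illustrative message
  when: __kernel_settings_register_verify_values is failed

Both of these tasks are skipped in this run because the verification above succeeded.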
11337 1726773102.16625: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 11337 1726773102.16749: in run() - task 0affffe7-6841-885f-bbcf-000000000c44 11337 1726773102.16765: variable 'ansible_search_path' from source: unknown 11337 1726773102.16769: variable 'ansible_search_path' from source: unknown 11337 1726773102.16797: calling self._execute() 11337 1726773102.16862: variable 'ansible_host' from source: host vars for 'managed_node2' 11337 1726773102.16870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11337 1726773102.16880: variable 'omit' from source: magic vars 11337 1726773102.17206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11337 1726773102.17436: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11337 1726773102.17470: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11337 1726773102.17497: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11337 1726773102.17523: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11337 1726773102.17600: variable '__kernel_settings_register_verify_values' from source: set_fact 11337 1726773102.17623: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11337 1726773102.17628: when evaluation is False, skipping this task 11337 1726773102.17632: _execute() done 11337 1726773102.17636: dumping result to json 11337 1726773102.17640: done dumping result, returning 11337 1726773102.17646: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-885f-bbcf-000000000c44] 11337 1726773102.17652: sending task result for task 0affffe7-6841-885f-bbcf-000000000c44 11337 1726773102.17675: done sending task result for task 0affffe7-6841-885f-bbcf-000000000c44 11337 1726773102.17679: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773102.17788: no more pending results, returning what we have 8240 1726773102.17792: results queue empty 8240 1726773102.17793: checking for any_errors_fatal 8240 1726773102.17797: done checking for any_errors_fatal 8240 1726773102.17798: checking for max_fail_percentage 8240 1726773102.17800: done checking for max_fail_percentage 8240 1726773102.17800: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.17803: done checking to see if all hosts have failed 8240 1726773102.17804: getting the remaining hosts for this loop 8240 1726773102.17805: done getting the remaining hosts for this loop 8240 1726773102.17808: getting the next task for host managed_node2 8240 1726773102.17816: done getting next task for host managed_node2 8240 1726773102.17819: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8240 1726773102.17822: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773102.17838: getting variables 8240 1726773102.17839: in VariableManager get_vars() 8240 1726773102.17868: Calling all_inventory to load vars for managed_node2 8240 1726773102.17871: Calling groups_inventory to load vars for managed_node2 8240 1726773102.17872: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.17880: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.17882: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.17883: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.17988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.18141: done with get_vars() 8240 1726773102.18147: done getting variables 8240 1726773102.18188: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.018) 0:01:20.825 **** 8240 1726773102.18213: entering _queue_task() for managed_node2/set_fact 8240 1726773102.18362: worker is 1 (out of 1 available) 8240 1726773102.18376: exiting _queue_task() for managed_node2/set_fact 8240 1726773102.18391: done queuing things up, now waiting for results queue to drain 8240 1726773102.18393: waiting for pending results... 
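The bookkeeping set_fact traced below records whether a reboot is still needed; in this run everything was applied live, so the flag comes out false (in the role the value is presumably computed from the bootloader results rather than hard-coded):

- name: Set the flag that reboot is needed to apply changes
  ansible.builtin.set_fact:
    kernel_settings_reboot_required: false  # value observed in this run's result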
11338 1726773102.18515: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 11338 1726773102.18626: in run() - task 0affffe7-6841-885f-bbcf-000000000a04 11338 1726773102.18642: variable 'ansible_search_path' from source: unknown 11338 1726773102.18646: variable 'ansible_search_path' from source: unknown 11338 1726773102.18672: calling self._execute() 11338 1726773102.18740: variable 'ansible_host' from source: host vars for 'managed_node2' 11338 1726773102.18748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11338 1726773102.18757: variable 'omit' from source: magic vars 11338 1726773102.18833: variable 'omit' from source: magic vars 11338 1726773102.18866: variable 'omit' from source: magic vars 11338 1726773102.18890: variable 'omit' from source: magic vars 11338 1726773102.18922: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11338 1726773102.18951: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11338 1726773102.18970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11338 1726773102.18987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11338 1726773102.18999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11338 1726773102.19022: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11338 1726773102.19026: variable 'ansible_host' from source: host vars for 'managed_node2' 11338 1726773102.19028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11338 1726773102.19098: Set connection var ansible_pipelining to False 11338 1726773102.19104: Set connection var ansible_timeout to 10 11338 1726773102.19109: Set connection var ansible_module_compression to ZIP_DEFLATED 11338 1726773102.19111: Set connection var ansible_shell_type to sh 11338 1726773102.19114: Set connection var ansible_shell_executable to /bin/sh 11338 1726773102.19117: Set connection var ansible_connection to ssh 11338 1726773102.19129: variable 'ansible_shell_executable' from source: unknown 11338 1726773102.19132: variable 'ansible_connection' from source: unknown 11338 1726773102.19134: variable 'ansible_module_compression' from source: unknown 11338 1726773102.19136: variable 'ansible_shell_type' from source: unknown 11338 1726773102.19137: variable 'ansible_shell_executable' from source: unknown 11338 1726773102.19139: variable 'ansible_host' from source: host vars for 'managed_node2' 11338 1726773102.19141: variable 'ansible_pipelining' from source: unknown 11338 1726773102.19143: variable 'ansible_timeout' from source: unknown 11338 1726773102.19145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11338 1726773102.19237: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11338 1726773102.19246: variable 'omit' from source: magic vars 11338 1726773102.19251: starting attempt loop 11338 1726773102.19253: running the handler 11338 1726773102.19261: handler 
run complete 11338 1726773102.19269: attempt loop complete, returning result 11338 1726773102.19272: _execute() done 11338 1726773102.19274: dumping result to json 11338 1726773102.19276: done dumping result, returning 11338 1726773102.19280: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-000000000a04] 11338 1726773102.19287: sending task result for task 0affffe7-6841-885f-bbcf-000000000a04 11338 1726773102.19311: done sending task result for task 0affffe7-6841-885f-bbcf-000000000a04 11338 1726773102.19314: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8240 1726773102.19649: no more pending results, returning what we have 8240 1726773102.19652: results queue empty 8240 1726773102.19653: checking for any_errors_fatal 8240 1726773102.19659: done checking for any_errors_fatal 8240 1726773102.19660: checking for max_fail_percentage 8240 1726773102.19661: done checking for max_fail_percentage 8240 1726773102.19662: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.19663: done checking to see if all hosts have failed 8240 1726773102.19664: getting the remaining hosts for this loop 8240 1726773102.19665: done getting the remaining hosts for this loop 8240 1726773102.19668: getting the next task for host managed_node2 8240 1726773102.19674: done getting next task for host managed_node2 8240 1726773102.19678: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8240 1726773102.19680: ^ state is: HOST STATE: block=2, task=45, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773102.19692: getting variables 8240 1726773102.19693: in VariableManager get_vars() 8240 1726773102.19726: Calling all_inventory to load vars for managed_node2 8240 1726773102.19729: Calling groups_inventory to load vars for managed_node2 8240 1726773102.19731: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.19740: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.19743: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.19746: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.19909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.20080: done with get_vars() 8240 1726773102.20090: done getting variables 8240 1726773102.20145: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.019) 0:01:20.845 **** 8240 1726773102.20175: entering _queue_task() for managed_node2/set_fact 8240 1726773102.20360: worker is 1 (out of 1 available) 8240 1726773102.20372: exiting _queue_task() for managed_node2/set_fact 8240 1726773102.20383: done queuing things up, now waiting for results queue to drain 8240 1726773102.20384: waiting for pending results... 11342 1726773102.20563: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 11342 1726773102.20683: in run() - task 0affffe7-6841-885f-bbcf-000000000a05 11342 1726773102.20705: variable 'ansible_search_path' from source: unknown 11342 1726773102.20709: variable 'ansible_search_path' from source: unknown 11342 1726773102.20736: calling self._execute() 11342 1726773102.20803: variable 'ansible_host' from source: host vars for 'managed_node2' 11342 1726773102.20811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11342 1726773102.20820: variable 'omit' from source: magic vars 11342 1726773102.20899: variable 'omit' from source: magic vars 11342 1726773102.20935: variable 'omit' from source: magic vars 11342 1726773102.21209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11342 1726773102.21448: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11342 1726773102.21482: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11342 1726773102.21512: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11342 1726773102.21537: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11342 1726773102.21637: variable '__kernel_settings_register_profile' from source: set_fact 11342 1726773102.21651: variable '__kernel_settings_register_mode' from source: set_fact 11342 1726773102.21659: variable '__kernel_settings_register_apply' from source: set_fact 11342 1726773102.21697: variable 'omit' from source: magic vars 11342 
1726773102.21722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11342 1726773102.21744: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11342 1726773102.21759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11342 1726773102.21773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11342 1726773102.21782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11342 1726773102.21809: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11342 1726773102.21814: variable 'ansible_host' from source: host vars for 'managed_node2' 11342 1726773102.21818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11342 1726773102.21881: Set connection var ansible_pipelining to False 11342 1726773102.21890: Set connection var ansible_timeout to 10 11342 1726773102.21898: Set connection var ansible_module_compression to ZIP_DEFLATED 11342 1726773102.21903: Set connection var ansible_shell_type to sh 11342 1726773102.21909: Set connection var ansible_shell_executable to /bin/sh 11342 1726773102.21914: Set connection var ansible_connection to ssh 11342 1726773102.21930: variable 'ansible_shell_executable' from source: unknown 11342 1726773102.21935: variable 'ansible_connection' from source: unknown 11342 1726773102.21938: variable 'ansible_module_compression' from source: unknown 11342 1726773102.21941: variable 'ansible_shell_type' from source: unknown 11342 1726773102.21944: variable 'ansible_shell_executable' from source: unknown 11342 1726773102.21948: variable 'ansible_host' from source: host vars for 'managed_node2' 11342 1726773102.21952: variable 'ansible_pipelining' from source: unknown 11342 1726773102.21955: variable 'ansible_timeout' from source: unknown 11342 1726773102.21959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11342 1726773102.22032: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11342 1726773102.22044: variable 'omit' from source: magic vars 11342 1726773102.22050: starting attempt loop 11342 1726773102.22053: running the handler 11342 1726773102.22063: handler run complete 11342 1726773102.22071: attempt loop complete, returning result 11342 1726773102.22074: _execute() done 11342 1726773102.22077: dumping result to json 11342 1726773102.22080: done dumping result, returning 11342 1726773102.22089: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-885f-bbcf-000000000a05] 11342 1726773102.22095: sending task result for task 0affffe7-6841-885f-bbcf-000000000a05 11342 1726773102.22117: done sending task result for task 0affffe7-6841-885f-bbcf-000000000a05 11342 1726773102.22120: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8240 1726773102.22237: no more pending results, returning what we have 8240 1726773102.22240: results queue empty 8240 1726773102.22241: checking 
for any_errors_fatal 8240 1726773102.22246: done checking for any_errors_fatal 8240 1726773102.22247: checking for max_fail_percentage 8240 1726773102.22248: done checking for max_fail_percentage 8240 1726773102.22249: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.22250: done checking to see if all hosts have failed 8240 1726773102.22250: getting the remaining hosts for this loop 8240 1726773102.22251: done getting the remaining hosts for this loop 8240 1726773102.22255: getting the next task for host managed_node2 8240 1726773102.22263: done getting next task for host managed_node2 8240 1726773102.22265: ^ task is: TASK: meta (role_complete) 8240 1726773102.22267: ^ state is: HOST STATE: block=2, task=46, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773102.22277: getting variables 8240 1726773102.22278: in VariableManager get_vars() 8240 1726773102.22312: Calling all_inventory to load vars for managed_node2 8240 1726773102.22315: Calling groups_inventory to load vars for managed_node2 8240 1726773102.22317: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.22326: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.22329: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.22331: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.22436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.22587: done with get_vars() 8240 1726773102.22593: done getting variables 8240 1726773102.22645: done queuing things up, now waiting for results queue to drain 8240 1726773102.22646: results queue empty 8240 1726773102.22646: checking for any_errors_fatal 8240 1726773102.22649: done checking for any_errors_fatal 8240 1726773102.22649: checking for max_fail_percentage 8240 1726773102.22650: done checking for max_fail_percentage 8240 1726773102.22654: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.22654: done checking to see if all hosts have failed 8240 1726773102.22654: getting the remaining hosts for this loop 8240 1726773102.22655: done getting the remaining hosts for this loop 8240 1726773102.22656: getting the next task for host managed_node2 8240 1726773102.22659: done getting next task for host managed_node2 8240 1726773102.22660: ^ task is: TASK: meta (flush_handlers) 8240 1726773102.22660: ^ state is: HOST STATE: block=2, task=47, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773102.22664: getting variables 8240 1726773102.22664: in VariableManager get_vars() 8240 1726773102.22671: Calling all_inventory to load vars for managed_node2 8240 1726773102.22673: Calling groups_inventory to load vars for managed_node2 8240 1726773102.22674: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.22676: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.22678: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.22679: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.22754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.22855: done with get_vars() 8240 1726773102.22861: done getting variables TASK [Force handlers] ********************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:191 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.027) 0:01:20.872 **** 8240 1726773102.22905: in VariableManager get_vars() 8240 1726773102.22913: Calling all_inventory to load vars for managed_node2 8240 1726773102.22914: Calling groups_inventory to load vars for managed_node2 8240 1726773102.22915: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.22918: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.22919: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.22920: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.22995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.23093: done with get_vars() META: triggered running handlers for managed_node2 8240 1726773102.23103: done queuing things up, now waiting for results queue to drain 8240 1726773102.23104: results queue empty 8240 1726773102.23104: checking for any_errors_fatal 8240 1726773102.23106: done checking for any_errors_fatal 8240 1726773102.23106: checking for max_fail_percentage 8240 1726773102.23107: done checking for max_fail_percentage 8240 1726773102.23107: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.23107: done checking to see if all hosts have failed 8240 1726773102.23108: getting the remaining hosts for this loop 8240 1726773102.23108: done getting the remaining hosts for this loop 8240 1726773102.23109: getting the next task for host managed_node2 8240 1726773102.23113: done getting next task for host managed_node2 8240 1726773102.23114: ^ task is: TASK: Ensure kernel_settings_reboot_required is not set or is false 8240 1726773102.23114: ^ state is: HOST STATE: block=2, task=48, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773102.23116: getting variables 8240 1726773102.23117: in VariableManager get_vars() 8240 1726773102.23123: Calling all_inventory to load vars for managed_node2 8240 1726773102.23125: Calling groups_inventory to load vars for managed_node2 8240 1726773102.23126: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.23128: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.23130: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.23131: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.23377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.23475: done with get_vars() 8240 1726773102.23480: done getting variables 8240 1726773102.23506: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings_reboot_required is not set or is false] *********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:194 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.006) 0:01:20.879 **** 8240 1726773102.23520: entering _queue_task() for managed_node2/assert 8240 1726773102.23688: worker is 1 (out of 1 available) 8240 1726773102.23702: exiting _queue_task() for managed_node2/assert 8240 1726773102.23716: done queuing things up, now waiting for results queue to drain 8240 1726773102.23719: waiting for pending results... 11343 1726773102.23849: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false 11343 1726773102.23956: in run() - task 0affffe7-6841-885f-bbcf-000000000027 11343 1726773102.23975: variable 'ansible_search_path' from source: unknown 11343 1726773102.24006: calling self._execute() 11343 1726773102.24079: variable 'ansible_host' from source: host vars for 'managed_node2' 11343 1726773102.24089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11343 1726773102.24099: variable 'omit' from source: magic vars 11343 1726773102.24177: variable 'omit' from source: magic vars 11343 1726773102.24207: variable 'omit' from source: magic vars 11343 1726773102.24232: variable 'omit' from source: magic vars 11343 1726773102.24265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11343 1726773102.24294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11343 1726773102.24317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11343 1726773102.24332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11343 1726773102.24344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11343 1726773102.24368: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11343 1726773102.24374: variable 'ansible_host' from source: host vars for 'managed_node2' 11343 1726773102.24378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11343 1726773102.24449: Set connection var 
ansible_pipelining to False 11343 1726773102.24456: Set connection var ansible_timeout to 10 11343 1726773102.24464: Set connection var ansible_module_compression to ZIP_DEFLATED 11343 1726773102.24467: Set connection var ansible_shell_type to sh 11343 1726773102.24472: Set connection var ansible_shell_executable to /bin/sh 11343 1726773102.24477: Set connection var ansible_connection to ssh 11343 1726773102.24496: variable 'ansible_shell_executable' from source: unknown 11343 1726773102.24500: variable 'ansible_connection' from source: unknown 11343 1726773102.24506: variable 'ansible_module_compression' from source: unknown 11343 1726773102.24510: variable 'ansible_shell_type' from source: unknown 11343 1726773102.24513: variable 'ansible_shell_executable' from source: unknown 11343 1726773102.24517: variable 'ansible_host' from source: host vars for 'managed_node2' 11343 1726773102.24521: variable 'ansible_pipelining' from source: unknown 11343 1726773102.24525: variable 'ansible_timeout' from source: unknown 11343 1726773102.24529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11343 1726773102.24622: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11343 1726773102.24636: variable 'omit' from source: magic vars 11343 1726773102.24642: starting attempt loop 11343 1726773102.24645: running the handler 11343 1726773102.24898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11343 1726773102.29087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11343 1726773102.29133: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11343 1726773102.29162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11343 1726773102.29198: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11343 1726773102.29221: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11343 1726773102.29266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11343 1726773102.29289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11343 1726773102.29310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11343 1726773102.29337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11343 1726773102.29348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11343 
1726773102.29422: variable 'kernel_settings_reboot_required' from source: set_fact 11343 1726773102.29437: Evaluated conditional (not kernel_settings_reboot_required | d(false)): True 11343 1726773102.29444: handler run complete 11343 1726773102.29458: attempt loop complete, returning result 11343 1726773102.29462: _execute() done 11343 1726773102.29465: dumping result to json 11343 1726773102.29469: done dumping result, returning 11343 1726773102.29474: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings_reboot_required is not set or is false [0affffe7-6841-885f-bbcf-000000000027] 11343 1726773102.29479: sending task result for task 0affffe7-6841-885f-bbcf-000000000027 11343 1726773102.29505: done sending task result for task 0affffe7-6841-885f-bbcf-000000000027 11343 1726773102.29508: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8240 1726773102.29622: no more pending results, returning what we have 8240 1726773102.29626: results queue empty 8240 1726773102.29627: checking for any_errors_fatal 8240 1726773102.29629: done checking for any_errors_fatal 8240 1726773102.29630: checking for max_fail_percentage 8240 1726773102.29631: done checking for max_fail_percentage 8240 1726773102.29632: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.29633: done checking to see if all hosts have failed 8240 1726773102.29633: getting the remaining hosts for this loop 8240 1726773102.29634: done getting the remaining hosts for this loop 8240 1726773102.29638: getting the next task for host managed_node2 8240 1726773102.29643: done getting next task for host managed_node2 8240 1726773102.29646: ^ task is: TASK: Ensure role reported changed 8240 1726773102.29647: ^ state is: HOST STATE: block=2, task=49, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773102.29650: getting variables 8240 1726773102.29652: in VariableManager get_vars() 8240 1726773102.29688: Calling all_inventory to load vars for managed_node2 8240 1726773102.29691: Calling groups_inventory to load vars for managed_node2 8240 1726773102.29693: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.29703: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.29712: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.29714: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.29872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.29991: done with get_vars() 8240 1726773102.29999: done getting variables 8240 1726773102.30042: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure role reported changed] ******************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:198 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.065) 0:01:20.944 **** 8240 1726773102.30063: entering _queue_task() for managed_node2/assert 8240 1726773102.30224: worker is 1 (out of 1 available) 8240 1726773102.30238: exiting _queue_task() for managed_node2/assert 8240 1726773102.30250: done queuing things up, now waiting for results queue to drain 8240 1726773102.30252: waiting for pending results... 11344 1726773102.30383: running TaskExecutor() for managed_node2/TASK: Ensure role reported changed 11344 1726773102.30488: in run() - task 0affffe7-6841-885f-bbcf-000000000028 11344 1726773102.30506: variable 'ansible_search_path' from source: unknown 11344 1726773102.30534: calling self._execute() 11344 1726773102.30607: variable 'ansible_host' from source: host vars for 'managed_node2' 11344 1726773102.30616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11344 1726773102.30625: variable 'omit' from source: magic vars 11344 1726773102.30707: variable 'omit' from source: magic vars 11344 1726773102.30733: variable 'omit' from source: magic vars 11344 1726773102.30755: variable 'omit' from source: magic vars 11344 1726773102.30791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11344 1726773102.30821: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11344 1726773102.30842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11344 1726773102.30857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11344 1726773102.30868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11344 1726773102.30894: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11344 1726773102.30899: variable 'ansible_host' from source: host vars for 'managed_node2' 11344 1726773102.30906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11344 1726773102.30973: Set connection var ansible_pipelining to False 11344 
1726773102.30980: Set connection var ansible_timeout to 10 11344 1726773102.30991: Set connection var ansible_module_compression to ZIP_DEFLATED 11344 1726773102.30995: Set connection var ansible_shell_type to sh 11344 1726773102.31000: Set connection var ansible_shell_executable to /bin/sh 11344 1726773102.31007: Set connection var ansible_connection to ssh 11344 1726773102.31024: variable 'ansible_shell_executable' from source: unknown 11344 1726773102.31028: variable 'ansible_connection' from source: unknown 11344 1726773102.31032: variable 'ansible_module_compression' from source: unknown 11344 1726773102.31035: variable 'ansible_shell_type' from source: unknown 11344 1726773102.31039: variable 'ansible_shell_executable' from source: unknown 11344 1726773102.31042: variable 'ansible_host' from source: host vars for 'managed_node2' 11344 1726773102.31047: variable 'ansible_pipelining' from source: unknown 11344 1726773102.31050: variable 'ansible_timeout' from source: unknown 11344 1726773102.31055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11344 1726773102.31149: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11344 1726773102.31161: variable 'omit' from source: magic vars 11344 1726773102.31167: starting attempt loop 11344 1726773102.31171: running the handler 11344 1726773102.31423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11344 1726773102.35523: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11344 1726773102.35566: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11344 1726773102.35597: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11344 1726773102.35624: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11344 1726773102.35643: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11344 1726773102.35699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11344 1726773102.35720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11344 1726773102.35739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11344 1726773102.35765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11344 1726773102.35776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11344 1726773102.35840: variable 
'__kernel_settings_changed' from source: set_fact 11344 1726773102.35854: Evaluated conditional (__kernel_settings_changed | d(false)): True 11344 1726773102.35861: handler run complete 11344 1726773102.35875: attempt loop complete, returning result 11344 1726773102.35879: _execute() done 11344 1726773102.35882: dumping result to json 11344 1726773102.35887: done dumping result, returning 11344 1726773102.35893: done running TaskExecutor() for managed_node2/TASK: Ensure role reported changed [0affffe7-6841-885f-bbcf-000000000028] 11344 1726773102.35898: sending task result for task 0affffe7-6841-885f-bbcf-000000000028 11344 1726773102.35919: done sending task result for task 0affffe7-6841-885f-bbcf-000000000028 11344 1726773102.35921: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 8240 1726773102.36088: no more pending results, returning what we have 8240 1726773102.36091: results queue empty 8240 1726773102.36093: checking for any_errors_fatal 8240 1726773102.36100: done checking for any_errors_fatal 8240 1726773102.36103: checking for max_fail_percentage 8240 1726773102.36104: done checking for max_fail_percentage 8240 1726773102.36105: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.36106: done checking to see if all hosts have failed 8240 1726773102.36106: getting the remaining hosts for this loop 8240 1726773102.36108: done getting the remaining hosts for this loop 8240 1726773102.36112: getting the next task for host managed_node2 8240 1726773102.36118: done getting next task for host managed_node2 8240 1726773102.36120: ^ task is: TASK: Check sysctl 8240 1726773102.36121: ^ state is: HOST STATE: block=2, task=50, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773102.36125: getting variables 8240 1726773102.36126: in VariableManager get_vars() 8240 1726773102.36160: Calling all_inventory to load vars for managed_node2 8240 1726773102.36162: Calling groups_inventory to load vars for managed_node2 8240 1726773102.36164: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.36175: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.36177: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.36188: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.36341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.38956: done with get_vars() 8240 1726773102.38963: done getting variables 8240 1726773102.38997: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysctl] ************************************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:202 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.089) 0:01:21.034 **** 8240 1726773102.39015: entering _queue_task() for managed_node2/shell 8240 1726773102.39186: worker is 1 (out of 1 available) 8240 1726773102.39204: exiting _queue_task() for managed_node2/shell 8240 1726773102.39216: done queuing things up, now waiting for results queue to drain 8240 1726773102.39218: waiting for pending results... 11345 1726773102.39328: running TaskExecutor() for managed_node2/TASK: Check sysctl 11345 1726773102.39437: in run() - task 0affffe7-6841-885f-bbcf-000000000029 11345 1726773102.39455: variable 'ansible_search_path' from source: unknown 11345 1726773102.39483: calling self._execute() 11345 1726773102.39554: variable 'ansible_host' from source: host vars for 'managed_node2' 11345 1726773102.39563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11345 1726773102.39573: variable 'omit' from source: magic vars 11345 1726773102.39652: variable 'omit' from source: magic vars 11345 1726773102.39679: variable 'omit' from source: magic vars 11345 1726773102.39704: variable 'omit' from source: magic vars 11345 1726773102.39738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11345 1726773102.39765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11345 1726773102.39784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11345 1726773102.39801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11345 1726773102.39814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11345 1726773102.39837: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11345 1726773102.39843: variable 'ansible_host' from source: host vars for 'managed_node2' 11345 1726773102.39847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11345 1726773102.39917: Set connection var ansible_pipelining to False 11345 1726773102.39924: Set 
connection var ansible_timeout to 10 11345 1726773102.39932: Set connection var ansible_module_compression to ZIP_DEFLATED 11345 1726773102.39935: Set connection var ansible_shell_type to sh 11345 1726773102.39940: Set connection var ansible_shell_executable to /bin/sh 11345 1726773102.39946: Set connection var ansible_connection to ssh 11345 1726773102.39962: variable 'ansible_shell_executable' from source: unknown 11345 1726773102.39966: variable 'ansible_connection' from source: unknown 11345 1726773102.39969: variable 'ansible_module_compression' from source: unknown 11345 1726773102.39973: variable 'ansible_shell_type' from source: unknown 11345 1726773102.39977: variable 'ansible_shell_executable' from source: unknown 11345 1726773102.39980: variable 'ansible_host' from source: host vars for 'managed_node2' 11345 1726773102.39983: variable 'ansible_pipelining' from source: unknown 11345 1726773102.39987: variable 'ansible_timeout' from source: unknown 11345 1726773102.39990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11345 1726773102.40077: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11345 1726773102.40090: variable 'omit' from source: magic vars 11345 1726773102.40095: starting attempt loop 11345 1726773102.40097: running the handler 11345 1726773102.40104: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11345 1726773102.40117: _low_level_execute_command(): starting 11345 1726773102.40123: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11345 1726773102.42490: stdout chunk (state=2): >>>/root <<< 11345 1726773102.42615: stderr chunk (state=3): >>><<< 11345 1726773102.42622: stdout chunk (state=3): >>><<< 11345 1726773102.42640: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11345 1726773102.42654: _low_level_execute_command(): starting 11345 1726773102.42660: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293 `" && echo ansible-tmp-1726773102.4264863-11345-1719295167293="` echo /root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293 `" ) && sleep 0' 11345 1726773102.45178: stdout chunk (state=2): >>>ansible-tmp-1726773102.4264863-11345-1719295167293=/root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293 <<< 11345 1726773102.45311: stderr chunk (state=3): >>><<< 11345 1726773102.45318: stdout chunk (state=3): >>><<< 11345 1726773102.45335: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773102.4264863-11345-1719295167293=/root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293 , stderr= 11345 1726773102.45364: variable 'ansible_module_compression' from source: unknown 11345 1726773102.45414: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11345 1726773102.45450: variable 'ansible_facts' from source: unknown 11345 
1726773102.45514: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293/AnsiballZ_command.py 11345 1726773102.45621: Sending initial data 11345 1726773102.45629: Sent initial data (153 bytes) 11345 1726773102.48110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpgltlgmj1 /root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293/AnsiballZ_command.py <<< 11345 1726773102.49198: stderr chunk (state=3): >>><<< 11345 1726773102.49209: stdout chunk (state=3): >>><<< 11345 1726773102.49230: done transferring module to remote 11345 1726773102.49242: _low_level_execute_command(): starting 11345 1726773102.49247: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293/ /root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293/AnsiballZ_command.py && sleep 0' 11345 1726773102.51581: stderr chunk (state=2): >>><<< 11345 1726773102.51591: stdout chunk (state=2): >>><<< 11345 1726773102.51608: _low_level_execute_command() done: rc=0, stdout=, stderr= 11345 1726773102.51613: _low_level_execute_command(): starting 11345 1726773102.51618: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293/AnsiballZ_command.py && sleep 0' 11345 1726773102.67291: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "start": "2024-09-19 15:11:42.665106", "end": "2024-09-19 15:11:42.670979", "delta": "0:00:00.005873", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11345 1726773102.68399: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11345 1726773102.68448: stderr chunk (state=3): >>><<< 11345 1726773102.68455: stdout chunk (state=3): >>><<< 11345 1726773102.68472: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "start": "2024-09-19 15:11:42.665106", "end": "2024-09-19 15:11:42.670979", "delta": "0:00:00.005873", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
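At this point the three verification steps traced above have finished on managed_node2: the assert at tests_change_settings.yml:194 evaluated (not kernel_settings_reboot_required | d(false)) to True, the assert at :198 evaluated (__kernel_settings_changed | d(false)) to True, and the shell step at :202 ran the pipeline "sysctl -n fs.file-max | grep -Lvxq 400001" under set -euo pipefail and returned rc=0. The task file itself is not reproduced in this log, so the following is only a minimal sketch of what those three tasks plausibly look like, reconstructed from the conditionals and module arguments captured above; the ansible.builtin.* module names and the changed_when: false setting (inferred from the "changed": false result and the "Evaluated conditional (False): False" entry) are assumptions, not quotes from the test file.

    # Hypothetical reconstruction; the real tasks live in
    # tests_change_settings.yml at lines 194, 198 and 202 and are not shown in this log.
    - name: Ensure kernel_settings_reboot_required is not set or is false
      ansible.builtin.assert:
        that:
          - not kernel_settings_reboot_required | d(false)

    - name: Ensure role reported changed
      ansible.builtin.assert:
        that:
          - __kernel_settings_changed | d(false)

    - name: Check sysctl
      ansible.builtin.shell: |
        set -euo pipefail
        sysctl -n fs.file-max | grep -Lvxq 400001
      changed_when: false

The grep flags carry the actual check: -x matches whole lines only, -v inverts the match so any line other than "400001" would be selected, -L reports inputs that contain no such selected line, and -q suppresses output. The intent is that the pipeline succeeds only when sysctl -n fs.file-max prints exactly 400001 and nothing else, and the rc=0 captured above is consistent with the fs.file-max value applied by the role still being in effect.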
11345 1726773102.68520: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11345 1726773102.68531: _low_level_execute_command(): starting 11345 1726773102.68537: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773102.4264863-11345-1719295167293/ > /dev/null 2>&1 && sleep 0' 11345 1726773102.70925: stderr chunk (state=2): >>><<< 11345 1726773102.70935: stdout chunk (state=2): >>><<< 11345 1726773102.70950: _low_level_execute_command() done: rc=0, stdout=, stderr= 11345 1726773102.70957: handler run complete 11345 1726773102.70975: Evaluated conditional (False): False 11345 1726773102.70984: attempt loop complete, returning result 11345 1726773102.70989: _execute() done 11345 1726773102.70992: dumping result to json 11345 1726773102.70997: done dumping result, returning 11345 1726773102.71004: done running TaskExecutor() for managed_node2/TASK: Check sysctl [0affffe7-6841-885f-bbcf-000000000029] 11345 1726773102.71011: sending task result for task 0affffe7-6841-885f-bbcf-000000000029 11345 1726773102.71045: done sending task result for task 0affffe7-6841-885f-bbcf-000000000029 11345 1726773102.71048: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\nsysctl -n fs.file-max | grep -Lvxq 400001", "delta": "0:00:00.005873", "end": "2024-09-19 15:11:42.670979", "rc": 0, "start": "2024-09-19 15:11:42.665106" } 8240 1726773102.71188: no more pending results, returning what we have 8240 1726773102.71191: results queue empty 8240 1726773102.71192: checking for any_errors_fatal 8240 1726773102.71200: done checking for any_errors_fatal 8240 1726773102.71200: checking for max_fail_percentage 8240 1726773102.71204: done checking for max_fail_percentage 8240 1726773102.71204: checking to see if all hosts have failed and the running result is not ok 8240 1726773102.71205: done checking to see if all hosts have failed 8240 1726773102.71206: getting the remaining hosts for this loop 8240 1726773102.71207: done getting the remaining hosts for this loop 8240 1726773102.71211: getting the next task for host managed_node2 8240 1726773102.71217: done getting next task for host managed_node2 8240 1726773102.71219: ^ task is: TASK: Check sysfs after role runs 8240 1726773102.71220: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773102.71223: getting variables 8240 1726773102.71225: in VariableManager get_vars() 8240 1726773102.71259: Calling all_inventory to load vars for managed_node2 8240 1726773102.71262: Calling groups_inventory to load vars for managed_node2 8240 1726773102.71264: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773102.71274: Calling all_plugins_play to load vars for managed_node2 8240 1726773102.71277: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773102.71279: Calling groups_plugins_play to load vars for managed_node2 8240 1726773102.71398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773102.71517: done with get_vars() 8240 1726773102.71526: done getting variables 8240 1726773102.71568: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check sysfs after role runs] ********************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:208 Thursday 19 September 2024 15:11:42 -0400 (0:00:00.325) 0:01:21.359 **** 8240 1726773102.71594: entering _queue_task() for managed_node2/command 8240 1726773102.71758: worker is 1 (out of 1 available) 8240 1726773102.71771: exiting _queue_task() for managed_node2/command 8240 1726773102.71784: done queuing things up, now waiting for results queue to drain 8240 1726773102.71788: waiting for pending results... 11353 1726773102.71908: running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs 11353 1726773102.72011: in run() - task 0affffe7-6841-885f-bbcf-00000000002a 11353 1726773102.72028: variable 'ansible_search_path' from source: unknown 11353 1726773102.72056: calling self._execute() 11353 1726773102.72129: variable 'ansible_host' from source: host vars for 'managed_node2' 11353 1726773102.72138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11353 1726773102.72147: variable 'omit' from source: magic vars 11353 1726773102.72308: variable 'omit' from source: magic vars 11353 1726773102.72340: variable 'omit' from source: magic vars 11353 1726773102.72362: variable 'omit' from source: magic vars 11353 1726773102.72399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11353 1726773102.72426: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11353 1726773102.72442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11353 1726773102.72455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11353 1726773102.72464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11353 1726773102.72488: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11353 1726773102.72493: variable 'ansible_host' from source: host vars for 'managed_node2' 11353 1726773102.72497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11353 1726773102.72570: Set connection var ansible_pipelining to False 11353 
1726773102.72578: Set connection var ansible_timeout to 10 11353 1726773102.72587: Set connection var ansible_module_compression to ZIP_DEFLATED 11353 1726773102.72590: Set connection var ansible_shell_type to sh 11353 1726773102.72596: Set connection var ansible_shell_executable to /bin/sh 11353 1726773102.72601: Set connection var ansible_connection to ssh 11353 1726773102.72617: variable 'ansible_shell_executable' from source: unknown 11353 1726773102.72620: variable 'ansible_connection' from source: unknown 11353 1726773102.72624: variable 'ansible_module_compression' from source: unknown 11353 1726773102.72627: variable 'ansible_shell_type' from source: unknown 11353 1726773102.72631: variable 'ansible_shell_executable' from source: unknown 11353 1726773102.72634: variable 'ansible_host' from source: host vars for 'managed_node2' 11353 1726773102.72638: variable 'ansible_pipelining' from source: unknown 11353 1726773102.72642: variable 'ansible_timeout' from source: unknown 11353 1726773102.72646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11353 1726773102.72737: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11353 1726773102.72750: variable 'omit' from source: magic vars 11353 1726773102.72755: starting attempt loop 11353 1726773102.72759: running the handler 11353 1726773102.72773: _low_level_execute_command(): starting 11353 1726773102.72781: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11353 1726773102.75106: stdout chunk (state=2): >>>/root <<< 11353 1726773102.75229: stderr chunk (state=3): >>><<< 11353 1726773102.75237: stdout chunk (state=3): >>><<< 11353 1726773102.75255: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11353 1726773102.75268: _low_level_execute_command(): starting 11353 1726773102.75275: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713 `" && echo ansible-tmp-1726773102.7526321-11353-138910145171713="` echo /root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713 `" ) && sleep 0' 11353 1726773102.77849: stdout chunk (state=2): >>>ansible-tmp-1726773102.7526321-11353-138910145171713=/root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713 <<< 11353 1726773102.77982: stderr chunk (state=3): >>><<< 11353 1726773102.77990: stdout chunk (state=3): >>><<< 11353 1726773102.78007: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773102.7526321-11353-138910145171713=/root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713 , stderr= 11353 1726773102.78034: variable 'ansible_module_compression' from source: unknown 11353 1726773102.78081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11353 1726773102.78118: variable 'ansible_facts' from source: unknown 11353 1726773102.78178: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713/AnsiballZ_command.py 11353 1726773102.78281: Sending initial data 11353 1726773102.78291: Sent initial data (155 bytes) 11353 1726773102.80780: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-8240kvoq26km/tmpqt7ntktn /root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713/AnsiballZ_command.py <<< 11353 1726773102.81873: stderr chunk (state=3): >>><<< 11353 1726773102.81880: stdout chunk (state=3): >>><<< 11353 1726773102.81905: done transferring module to remote 11353 1726773102.81916: _low_level_execute_command(): starting 11353 1726773102.81921: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713/ /root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713/AnsiballZ_command.py && sleep 0' 11353 1726773102.84269: stderr chunk (state=2): >>><<< 11353 1726773102.84277: stdout chunk (state=2): >>><<< 11353 1726773102.84293: _low_level_execute_command() done: rc=0, stdout=, stderr= 11353 1726773102.84297: _low_level_execute_command(): starting 11353 1726773102.84304: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713/AnsiballZ_command.py && sleep 0' 11353 1726773102.99499: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu"], "start": "2024-09-19 15:11:42.989794", "end": "2024-09-19 15:11:42.993021", "delta": "0:00:00.003227", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -Lxqv 60666 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11353 1726773103.00582: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11353 1726773103.00630: stderr chunk (state=3): >>><<< 11353 1726773103.00636: stdout chunk (state=3): >>><<< 11353 1726773103.00651: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu"], "start": "2024-09-19 15:11:42.989794", "end": "2024-09-19 15:11:42.993021", "delta": "0:00:00.003227", "msg": "", "invocation": {"module_args": {"_raw_params": "grep -Lxqv 60666 /sys/class/net/lo/mtu", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
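The sysfs verification that follows uses the same grep pattern, but through the command module rather than shell: the module args captured above show _raw_params "grep -Lxqv 60666 /sys/class/net/lo/mtu" with _uses_shell false, and the result again comes back with rc=0 and "changed": false. The task source at tests_change_settings.yml:208 is not part of this log, so this is only a sketch of the likely definition; as before, the ansible.builtin.command name and changed_when: false are assumptions inferred from the trace rather than quoted from the file.

    # Hypothetical reconstruction; the real task is at tests_change_settings.yml:208
    # and is not reproduced in this log.
    - name: Check sysfs after role runs
      ansible.builtin.command: grep -Lxqv 60666 /sys/class/net/lo/mtu
      changed_when: false

Because grep reads the sysfs file directly, rc=0 here indicates that /sys/class/net/lo/mtu contains exactly the single line 60666, presumably the loopback MTU value the test configured through the role's sysfs settings, so the setting is still in place after the role run.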
11353 1726773103.00695: done with _execute_module (ansible.legacy.command, {'_raw_params': 'grep -Lxqv 60666 /sys/class/net/lo/mtu', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11353 1726773103.00707: _low_level_execute_command(): starting 11353 1726773103.00713: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773102.7526321-11353-138910145171713/ > /dev/null 2>&1 && sleep 0' 11353 1726773103.03100: stderr chunk (state=2): >>><<< 11353 1726773103.03111: stdout chunk (state=2): >>><<< 11353 1726773103.03124: _low_level_execute_command() done: rc=0, stdout=, stderr= 11353 1726773103.03131: handler run complete 11353 1726773103.03149: Evaluated conditional (False): False 11353 1726773103.03158: attempt loop complete, returning result 11353 1726773103.03162: _execute() done 11353 1726773103.03165: dumping result to json 11353 1726773103.03170: done dumping result, returning 11353 1726773103.03178: done running TaskExecutor() for managed_node2/TASK: Check sysfs after role runs [0affffe7-6841-885f-bbcf-00000000002a] 11353 1726773103.03184: sending task result for task 0affffe7-6841-885f-bbcf-00000000002a 11353 1726773103.03220: done sending task result for task 0affffe7-6841-885f-bbcf-00000000002a 11353 1726773103.03223: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "grep", "-Lxqv", "60666", "/sys/class/net/lo/mtu" ], "delta": "0:00:00.003227", "end": "2024-09-19 15:11:42.993021", "rc": 0, "start": "2024-09-19 15:11:42.989794" } 8240 1726773103.03356: no more pending results, returning what we have 8240 1726773103.03360: results queue empty 8240 1726773103.03361: checking for any_errors_fatal 8240 1726773103.03368: done checking for any_errors_fatal 8240 1726773103.03368: checking for max_fail_percentage 8240 1726773103.03370: done checking for max_fail_percentage 8240 1726773103.03370: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.03371: done checking to see if all hosts have failed 8240 1726773103.03372: getting the remaining hosts for this loop 8240 1726773103.03373: done getting the remaining hosts for this loop 8240 1726773103.03377: getting the next task for host managed_node2 8240 1726773103.03386: done getting next task for host managed_node2 8240 1726773103.03389: ^ task is: TASK: Cleanup 8240 1726773103.03390: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773103.03393: getting variables 8240 1726773103.03395: in VariableManager get_vars() 8240 1726773103.03428: Calling all_inventory to load vars for managed_node2 8240 1726773103.03431: Calling groups_inventory to load vars for managed_node2 8240 1726773103.03432: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.03443: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.03446: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.03448: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.03610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.03717: done with get_vars() 8240 1726773103.03725: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:213 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.321) 0:01:21.681 **** 8240 1726773103.03790: entering _queue_task() for managed_node2/include_tasks 8240 1726773103.03945: worker is 1 (out of 1 available) 8240 1726773103.03959: exiting _queue_task() for managed_node2/include_tasks 8240 1726773103.03971: done queuing things up, now waiting for results queue to drain 8240 1726773103.03974: waiting for pending results... 11361 1726773103.04100: running TaskExecutor() for managed_node2/TASK: Cleanup 11361 1726773103.04215: in run() - task 0affffe7-6841-885f-bbcf-00000000002b 11361 1726773103.04231: variable 'ansible_search_path' from source: unknown 11361 1726773103.04260: calling self._execute() 11361 1726773103.04334: variable 'ansible_host' from source: host vars for 'managed_node2' 11361 1726773103.04344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11361 1726773103.04353: variable 'omit' from source: magic vars 11361 1726773103.04437: _execute() done 11361 1726773103.04443: dumping result to json 11361 1726773103.04447: done dumping result, returning 11361 1726773103.04453: done running TaskExecutor() for managed_node2/TASK: Cleanup [0affffe7-6841-885f-bbcf-00000000002b] 11361 1726773103.04459: sending task result for task 0affffe7-6841-885f-bbcf-00000000002b 11361 1726773103.04488: done sending task result for task 0affffe7-6841-885f-bbcf-00000000002b 11361 1726773103.04491: WORKER PROCESS EXITING 8240 1726773103.04581: no more pending results, returning what we have 8240 1726773103.04587: in VariableManager get_vars() 8240 1726773103.04623: Calling all_inventory to load vars for managed_node2 8240 1726773103.04626: Calling groups_inventory to load vars for managed_node2 8240 1726773103.04628: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.04638: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.04640: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.04643: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.04752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.04860: done with get_vars() 8240 1726773103.04865: variable 'ansible_search_path' from source: unknown 8240 1726773103.04875: we have included files to process 8240 1726773103.04876: generating all_blocks data 8240 1726773103.04877: done generating all_blocks data 8240 1726773103.04882: processing included file: 
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8240 1726773103.04882: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8240 1726773103.04884: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml for managed_node2 8240 1726773103.05510: done processing included file 8240 1726773103.05513: iterating over new_blocks loaded from include file 8240 1726773103.05513: in VariableManager get_vars() 8240 1726773103.05524: done with get_vars() 8240 1726773103.05525: filtering new block on tags 8240 1726773103.05540: done filtering new block on tags 8240 1726773103.05542: in VariableManager get_vars() 8240 1726773103.05550: done with get_vars() 8240 1726773103.05551: filtering new block on tags 8240 1726773103.05595: done filtering new block on tags 8240 1726773103.05596: done iterating over new_blocks loaded from include file 8240 1726773103.05597: extending task lists for all hosts with included blocks 8240 1726773103.08209: done extending task lists 8240 1726773103.08210: done processing included files 8240 1726773103.08211: results queue empty 8240 1726773103.08211: checking for any_errors_fatal 8240 1726773103.08215: done checking for any_errors_fatal 8240 1726773103.08216: checking for max_fail_percentage 8240 1726773103.08216: done checking for max_fail_percentage 8240 1726773103.08217: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.08217: done checking to see if all hosts have failed 8240 1726773103.08218: getting the remaining hosts for this loop 8240 1726773103.08219: done getting the remaining hosts for this loop 8240 1726773103.08221: getting the next task for host managed_node2 8240 1726773103.08224: done getting next task for host managed_node2 8240 1726773103.08225: ^ task is: TASK: Show current tuned profile settings 8240 1726773103.08227: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773103.08228: getting variables 8240 1726773103.08229: in VariableManager get_vars() 8240 1726773103.08240: Calling all_inventory to load vars for managed_node2 8240 1726773103.08241: Calling groups_inventory to load vars for managed_node2 8240 1726773103.08242: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.08247: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.08249: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.08250: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.08349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.08461: done with get_vars() 8240 1726773103.08469: done getting variables 8240 1726773103.08501: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current tuned profile settings] ************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:2 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.047) 0:01:21.729 **** 8240 1726773103.08522: entering _queue_task() for managed_node2/command 8240 1726773103.08717: worker is 1 (out of 1 available) 8240 1726773103.08731: exiting _queue_task() for managed_node2/command 8240 1726773103.08744: done queuing things up, now waiting for results queue to drain 8240 1726773103.08746: waiting for pending results... 11362 1726773103.08874: running TaskExecutor() for managed_node2/TASK: Show current tuned profile settings 11362 1726773103.08990: in run() - task 0affffe7-6841-885f-bbcf-000000000cab 11362 1726773103.09010: variable 'ansible_search_path' from source: unknown 11362 1726773103.09014: variable 'ansible_search_path' from source: unknown 11362 1726773103.09043: calling self._execute() 11362 1726773103.09120: variable 'ansible_host' from source: host vars for 'managed_node2' 11362 1726773103.09130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11362 1726773103.09138: variable 'omit' from source: magic vars 11362 1726773103.09220: variable 'omit' from source: magic vars 11362 1726773103.09250: variable 'omit' from source: magic vars 11362 1726773103.09498: variable '__kernel_settings_profile_filename' from source: role '' exported vars 11362 1726773103.09558: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11362 1726773103.09630: variable '__kernel_settings_profile_parent' from source: set_fact 11362 1726773103.09639: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 11362 1726773103.09674: variable 'omit' from source: magic vars 11362 1726773103.09709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11362 1726773103.09738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11362 1726773103.09757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11362 1726773103.09772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11362 1726773103.09783: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11362 1726773103.09812: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11362 1726773103.09818: variable 'ansible_host' from source: host vars for 'managed_node2' 11362 1726773103.09823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11362 1726773103.09893: Set connection var ansible_pipelining to False 11362 1726773103.09903: Set connection var ansible_timeout to 10 11362 1726773103.09911: Set connection var ansible_module_compression to ZIP_DEFLATED 11362 1726773103.09914: Set connection var ansible_shell_type to sh 11362 1726773103.09920: Set connection var ansible_shell_executable to /bin/sh 11362 1726773103.09925: Set connection var ansible_connection to ssh 11362 1726773103.09940: variable 'ansible_shell_executable' from source: unknown 11362 1726773103.09944: variable 'ansible_connection' from source: unknown 11362 1726773103.09947: variable 'ansible_module_compression' from source: unknown 11362 1726773103.09950: variable 'ansible_shell_type' from source: unknown 11362 1726773103.09953: variable 'ansible_shell_executable' from source: unknown 11362 1726773103.09955: variable 'ansible_host' from source: host vars for 'managed_node2' 11362 1726773103.09958: variable 'ansible_pipelining' from source: unknown 11362 1726773103.09959: variable 'ansible_timeout' from source: unknown 11362 1726773103.09961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11362 1726773103.10064: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11362 1726773103.10075: variable 'omit' from source: magic vars 11362 1726773103.10081: starting attempt loop 11362 1726773103.10086: running the handler 11362 1726773103.10099: _low_level_execute_command(): starting 11362 1726773103.10108: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11362 1726773103.12454: stdout chunk (state=2): >>>/root <<< 11362 1726773103.12576: stderr chunk (state=3): >>><<< 11362 1726773103.12582: stdout chunk (state=3): >>><<< 11362 1726773103.12604: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11362 1726773103.12616: _low_level_execute_command(): starting 11362 1726773103.12622: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918 `" && echo ansible-tmp-1726773103.1261163-11362-121282711261918="` echo /root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918 `" ) && sleep 0' 11362 1726773103.15160: stdout chunk (state=2): >>>ansible-tmp-1726773103.1261163-11362-121282711261918=/root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918 <<< 11362 1726773103.15287: stderr chunk (state=3): >>><<< 11362 1726773103.15293: stdout chunk (state=3): >>><<< 11362 1726773103.15307: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773103.1261163-11362-121282711261918=/root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918 , stderr= 11362 1726773103.15329: variable 'ansible_module_compression' from source: unknown 11362 1726773103.15368: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11362 1726773103.15401: variable 'ansible_facts' from source: unknown 11362 1726773103.15462: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918/AnsiballZ_command.py 11362 1726773103.15557: Sending initial data 11362 1726773103.15564: Sent initial data (155 bytes) 11362 1726773103.18028: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpo_lp8w98 /root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918/AnsiballZ_command.py <<< 11362 1726773103.19124: stderr chunk (state=3): >>><<< 11362 1726773103.19134: stdout chunk (state=3): >>><<< 11362 1726773103.19153: done transferring module to remote 11362 1726773103.19164: _low_level_execute_command(): starting 11362 1726773103.19169: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918/ /root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918/AnsiballZ_command.py && sleep 0' 11362 1726773103.21503: stderr chunk (state=2): >>><<< 11362 1726773103.21513: stdout chunk (state=2): >>><<< 11362 1726773103.21527: _low_level_execute_command() done: rc=0, stdout=, stderr= 11362 1726773103.21532: _low_level_execute_command(): starting 11362 1726773103.21537: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918/AnsiballZ_command.py && sleep 0' 11362 1726773103.36832: stdout chunk (state=2): >>> {"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings\n[vm]\ntransparent_hugepages = never", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 15:11:43.363525", "end": "2024-09-19 15:11:43.366323", "delta": "0:00:00.002798", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11362 1726773103.37962: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11362 1726773103.38013: stderr chunk (state=3): >>><<< 11362 1726773103.38020: stdout chunk (state=3): >>><<< 11362 1726773103.38038: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "#\n# Ansible managed\n#\n# system_role:kernel_settings\n\n[main]\nsummary = kernel settings\n[vm]\ntransparent_hugepages = never", "stderr": "", "rc": 0, "cmd": ["cat", "/etc/tuned/kernel_settings/tuned.conf"], "start": "2024-09-19 15:11:43.363525", "end": "2024-09-19 15:11:43.366323", "delta": "0:00:00.002798", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /etc/tuned/kernel_settings/tuned.conf", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
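The stdout captured above is the generated /etc/tuned/kernel_settings/tuned.conf: an INI-style tuned profile with a [main] section carrying summary = kernel settings and a [vm] section setting transparent_hugepages = never. Purely as an illustration (not part of the test suite), here is a short Python sketch that reads a profile of this shape with configparser and reports the transparent_hugepages value; the path is the one printed in the log.

import configparser

profile_path = "/etc/tuned/kernel_settings/tuned.conf"

config = configparser.ConfigParser()
with open(profile_path) as f:
    # The leading "# Ansible managed" lines are comments and are skipped.
    config.read_file(f)

summary = config.get("main", "summary", fallback="<none>")
thp = config.get("vm", "transparent_hugepages", fallback="<unset>")
print(f"profile summary: {summary}")
print(f"transparent_hugepages: {thp}")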
11362 1726773103.38071: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11362 1726773103.38082: _low_level_execute_command(): starting 11362 1726773103.38090: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773103.1261163-11362-121282711261918/ > /dev/null 2>&1 && sleep 0' 11362 1726773103.40491: stderr chunk (state=2): >>><<< 11362 1726773103.40500: stdout chunk (state=2): >>><<< 11362 1726773103.40515: _low_level_execute_command() done: rc=0, stdout=, stderr= 11362 1726773103.40523: handler run complete 11362 1726773103.40541: Evaluated conditional (False): False 11362 1726773103.40550: attempt loop complete, returning result 11362 1726773103.40554: _execute() done 11362 1726773103.40557: dumping result to json 11362 1726773103.40563: done dumping result, returning 11362 1726773103.40569: done running TaskExecutor() for managed_node2/TASK: Show current tuned profile settings [0affffe7-6841-885f-bbcf-000000000cab] 11362 1726773103.40575: sending task result for task 0affffe7-6841-885f-bbcf-000000000cab 11362 1726773103.40606: done sending task result for task 0affffe7-6841-885f-bbcf-000000000cab 11362 1726773103.40610: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "cat", "/etc/tuned/kernel_settings/tuned.conf" ], "delta": "0:00:00.002798", "end": "2024-09-19 15:11:43.366323", "rc": 0, "start": "2024-09-19 15:11:43.363525" } STDOUT: # # Ansible managed # # system_role:kernel_settings [main] summary = kernel settings [vm] transparent_hugepages = never 8240 1726773103.40750: no more pending results, returning what we have 8240 1726773103.40753: results queue empty 8240 1726773103.40754: checking for any_errors_fatal 8240 1726773103.40756: done checking for any_errors_fatal 8240 1726773103.40756: checking for max_fail_percentage 8240 1726773103.40758: done checking for max_fail_percentage 8240 1726773103.40759: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.40760: done checking to see if all hosts have failed 8240 1726773103.40760: getting the remaining hosts for this loop 8240 1726773103.40761: done getting the remaining hosts for this loop 8240 1726773103.40765: getting the next task for host managed_node2 8240 1726773103.40775: done getting next task for host managed_node2 8240 1726773103.40778: ^ task is: TASK: Run role with purge to remove everything 8240 1726773103.40780: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.40783: getting variables 8240 1726773103.40786: in VariableManager get_vars() 8240 1726773103.40824: Calling all_inventory to load vars for managed_node2 8240 1726773103.40826: Calling groups_inventory to load vars for managed_node2 8240 1726773103.40828: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.40839: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.40842: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.40844: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.40965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.41087: done with get_vars() 8240 1726773103.41095: done getting variables TASK [Run role with purge to remove everything] ******************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:9 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.326) 0:01:22.055 **** 8240 1726773103.41165: entering _queue_task() for managed_node2/include_role 8240 1726773103.41338: worker is 1 (out of 1 available) 8240 1726773103.41352: exiting _queue_task() for managed_node2/include_role 8240 1726773103.41365: done queuing things up, now waiting for results queue to drain 8240 1726773103.41367: waiting for pending results... 11373 1726773103.41488: running TaskExecutor() for managed_node2/TASK: Run role with purge to remove everything 11373 1726773103.41605: in run() - task 0affffe7-6841-885f-bbcf-000000000cad 11373 1726773103.41622: variable 'ansible_search_path' from source: unknown 11373 1726773103.41626: variable 'ansible_search_path' from source: unknown 11373 1726773103.41654: calling self._execute() 11373 1726773103.41731: variable 'ansible_host' from source: host vars for 'managed_node2' 11373 1726773103.41740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11373 1726773103.41749: variable 'omit' from source: magic vars 11373 1726773103.41827: _execute() done 11373 1726773103.41834: dumping result to json 11373 1726773103.41839: done dumping result, returning 11373 1726773103.41844: done running TaskExecutor() for managed_node2/TASK: Run role with purge to remove everything [0affffe7-6841-885f-bbcf-000000000cad] 11373 1726773103.41851: sending task result for task 0affffe7-6841-885f-bbcf-000000000cad 11373 1726773103.41881: done sending task result for task 0affffe7-6841-885f-bbcf-000000000cad 11373 1726773103.41887: WORKER PROCESS EXITING 8240 1726773103.41993: no more pending results, returning what we have 8240 1726773103.41997: in VariableManager get_vars() 8240 1726773103.42036: Calling all_inventory to load vars for managed_node2 8240 1726773103.42039: Calling groups_inventory to load vars for managed_node2 8240 1726773103.42041: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.42051: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.42053: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.42056: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.42206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.42313: done with get_vars() 8240 
1726773103.42318: variable 'ansible_search_path' from source: unknown 8240 1726773103.42319: variable 'ansible_search_path' from source: unknown 8240 1726773103.42511: variable 'omit' from source: magic vars 8240 1726773103.42532: variable 'omit' from source: magic vars 8240 1726773103.42541: variable 'omit' from source: magic vars 8240 1726773103.42544: we have included files to process 8240 1726773103.42545: generating all_blocks data 8240 1726773103.42545: done generating all_blocks data 8240 1726773103.42548: processing included file: fedora.linux_system_roles.kernel_settings 8240 1726773103.42562: in VariableManager get_vars() 8240 1726773103.42572: done with get_vars() 8240 1726773103.42594: in VariableManager get_vars() 8240 1726773103.42607: done with get_vars() 8240 1726773103.42634: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8240 1726773103.42672: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8240 1726773103.42690: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8240 1726773103.42739: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8240 1726773103.43069: in VariableManager get_vars() 8240 1726773103.43083: done with get_vars() 8240 1726773103.43887: in VariableManager get_vars() 8240 1726773103.43904: done with get_vars() 8240 1726773103.44007: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8240 1726773103.44402: iterating over new_blocks loaded from include file 8240 1726773103.44403: in VariableManager get_vars() 8240 1726773103.44427: done with get_vars() 8240 1726773103.44428: filtering new block on tags 8240 1726773103.44455: done filtering new block on tags 8240 1726773103.44456: in VariableManager get_vars() 8240 1726773103.44466: done with get_vars() 8240 1726773103.44467: filtering new block on tags 8240 1726773103.44496: done filtering new block on tags 8240 1726773103.44497: in VariableManager get_vars() 8240 1726773103.44508: done with get_vars() 8240 1726773103.44509: filtering new block on tags 8240 1726773103.44610: done filtering new block on tags 8240 1726773103.44612: in VariableManager get_vars() 8240 1726773103.44624: done with get_vars() 8240 1726773103.44625: filtering new block on tags 8240 1726773103.44635: done filtering new block on tags 8240 1726773103.44636: done iterating over new_blocks loaded from include file 8240 1726773103.44637: extending task lists for all hosts with included blocks 8240 1726773103.44821: done extending task lists 8240 1726773103.44822: done processing included files 8240 1726773103.44822: results queue empty 8240 1726773103.44823: checking for any_errors_fatal 8240 1726773103.44826: done checking for any_errors_fatal 8240 1726773103.44826: checking for max_fail_percentage 8240 1726773103.44827: done checking for max_fail_percentage 8240 1726773103.44827: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.44827: done checking to see if all hosts have failed 8240 1726773103.44828: getting the remaining hosts for this loop 8240 1726773103.44829: done getting the remaining hosts for this loop 8240 1726773103.44830: getting the next task for host managed_node2 8240 1726773103.44833: done getting next task for host 
managed_node2 8240 1726773103.44835: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8240 1726773103.44836: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.44843: getting variables 8240 1726773103.44843: in VariableManager get_vars() 8240 1726773103.44851: Calling all_inventory to load vars for managed_node2 8240 1726773103.44853: Calling groups_inventory to load vars for managed_node2 8240 1726773103.44854: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.44857: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.44859: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.44860: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.44938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.45046: done with get_vars() 8240 1726773103.45053: done getting variables 8240 1726773103.45076: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.039) 0:01:22.094 **** 8240 1726773103.45106: entering _queue_task() for managed_node2/fail 8240 1726773103.45277: worker is 1 (out of 1 available) 8240 1726773103.45291: exiting _queue_task() for managed_node2/fail 8240 1726773103.45304: done queuing things up, now waiting for results queue to drain 8240 1726773103.45306: waiting for pending results... 
11374 1726773103.45437: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 11374 1726773103.45563: in run() - task 0affffe7-6841-885f-bbcf-000000000ed1 11374 1726773103.45580: variable 'ansible_search_path' from source: unknown 11374 1726773103.45584: variable 'ansible_search_path' from source: unknown 11374 1726773103.45615: calling self._execute() 11374 1726773103.45686: variable 'ansible_host' from source: host vars for 'managed_node2' 11374 1726773103.45695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11374 1726773103.45706: variable 'omit' from source: magic vars 11374 1726773103.46064: variable 'kernel_settings_sysctl' from source: include params 11374 1726773103.46074: variable '__kernel_settings_state_empty' from source: role '' all vars 11374 1726773103.46089: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 11374 1726773103.46284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11374 1726773103.48012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11374 1726773103.48060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11374 1726773103.48103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11374 1726773103.48130: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11374 1726773103.48150: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11374 1726773103.48210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11374 1726773103.48232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11374 1726773103.48251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11374 1726773103.48278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11374 1726773103.48291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11374 1726773103.48330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11374 1726773103.48348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11374 1726773103.48366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11374 1726773103.48394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11374 1726773103.48408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11374 1726773103.48436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11374 1726773103.48454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11374 1726773103.48473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11374 1726773103.48499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11374 1726773103.48513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11374 1726773103.48699: variable 'kernel_settings_sysctl' from source: include params 11374 1726773103.48722: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 11374 1726773103.48727: when evaluation is False, skipping this task 11374 1726773103.48731: _execute() done 11374 1726773103.48735: dumping result to json 11374 1726773103.48739: done dumping result, returning 11374 1726773103.48745: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-885f-bbcf-000000000ed1] 11374 1726773103.48751: sending task result for task 0affffe7-6841-885f-bbcf-000000000ed1 11374 1726773103.48775: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ed1 11374 1726773103.48778: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8240 1726773103.48888: no more pending results, returning what we have 8240 1726773103.48892: results queue empty 8240 1726773103.48893: checking for any_errors_fatal 8240 1726773103.48895: done checking for any_errors_fatal 8240 1726773103.48895: checking for max_fail_percentage 8240 1726773103.48897: done checking for max_fail_percentage 8240 1726773103.48898: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.48899: done checking to see if all hosts have failed 8240 1726773103.48899: getting the remaining 
hosts for this loop 8240 1726773103.48901: done getting the remaining hosts for this loop 8240 1726773103.48904: getting the next task for host managed_node2 8240 1726773103.48912: done getting next task for host managed_node2 8240 1726773103.48916: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8240 1726773103.48919: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.48942: getting variables 8240 1726773103.48944: in VariableManager get_vars() 8240 1726773103.48976: Calling all_inventory to load vars for managed_node2 8240 1726773103.48978: Calling groups_inventory to load vars for managed_node2 8240 1726773103.48980: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.48991: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.48994: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.48996: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.49119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.49404: done with get_vars() 8240 1726773103.49411: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.043) 0:01:22.138 **** 8240 1726773103.49474: entering _queue_task() for managed_node2/include_tasks 8240 1726773103.49639: worker is 1 (out of 1 available) 8240 1726773103.49653: exiting _queue_task() for managed_node2/include_tasks 8240 1726773103.49666: done queuing things up, now waiting for results queue to drain 8240 1726773103.49667: waiting for pending results... 
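The conditional evaluated a few records above, (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (... "sameas", false ...), guards the fail task "Check sysctl settings for boolean values": the role refuses to proceed if any sysctl setting carries a literal YAML boolean, and here the condition came out False, so the task was skipped. A rough Python rendering of that filter chain follows; the sample data is hypothetical, since the real values come from the include params referenced in the log.

# Hypothetical data shaped like kernel_settings_sysctl:
# a list of {"name": ..., "value": ...} mappings.
kernel_settings_sysctl = [
    {"name": "fs.file-max", "value": 400000},
    {"name": "kernel.threads-max", "value": 65536},
]

def has_boolean_value(settings):
    # selectattr("value", "defined") keeps items that have a value at all;
    # "sameas" is an identity test, so only literal True/False count,
    # not truthy integers or strings.
    defined = [s for s in settings if "value" in s]
    return any(s["value"] is True for s in defined) or any(
        s["value"] is False for s in defined
    )

print(has_boolean_value(kernel_settings_sysctl))  # False for this sample data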
11375 1726773103.49793: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 11375 1726773103.49923: in run() - task 0affffe7-6841-885f-bbcf-000000000ed2 11375 1726773103.49939: variable 'ansible_search_path' from source: unknown 11375 1726773103.49943: variable 'ansible_search_path' from source: unknown 11375 1726773103.49970: calling self._execute() 11375 1726773103.50041: variable 'ansible_host' from source: host vars for 'managed_node2' 11375 1726773103.50049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11375 1726773103.50058: variable 'omit' from source: magic vars 11375 1726773103.50141: _execute() done 11375 1726773103.50147: dumping result to json 11375 1726773103.50151: done dumping result, returning 11375 1726773103.50158: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-885f-bbcf-000000000ed2] 11375 1726773103.50164: sending task result for task 0affffe7-6841-885f-bbcf-000000000ed2 11375 1726773103.50192: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ed2 11375 1726773103.50196: WORKER PROCESS EXITING 8240 1726773103.50301: no more pending results, returning what we have 8240 1726773103.50305: in VariableManager get_vars() 8240 1726773103.50341: Calling all_inventory to load vars for managed_node2 8240 1726773103.50344: Calling groups_inventory to load vars for managed_node2 8240 1726773103.50345: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.50354: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.50357: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.50359: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.50472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.50592: done with get_vars() 8240 1726773103.50597: variable 'ansible_search_path' from source: unknown 8240 1726773103.50597: variable 'ansible_search_path' from source: unknown 8240 1726773103.50621: we have included files to process 8240 1726773103.50621: generating all_blocks data 8240 1726773103.50622: done generating all_blocks data 8240 1726773103.50628: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773103.50629: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8240 1726773103.50630: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node2 8240 1726773103.51066: done processing included file 8240 1726773103.51069: iterating over new_blocks loaded from include file 8240 1726773103.51069: in VariableManager get_vars() 8240 1726773103.51088: done with get_vars() 8240 1726773103.51090: filtering new block on tags 8240 1726773103.51109: done filtering new block on tags 8240 1726773103.51130: in VariableManager get_vars() 8240 1726773103.51145: done with get_vars() 8240 1726773103.51146: filtering new block on tags 8240 1726773103.51172: done filtering new block on tags 8240 1726773103.51174: in VariableManager get_vars() 8240 1726773103.51190: done 
with get_vars() 8240 1726773103.51191: filtering new block on tags 8240 1726773103.51216: done filtering new block on tags 8240 1726773103.51218: in VariableManager get_vars() 8240 1726773103.51234: done with get_vars() 8240 1726773103.51235: filtering new block on tags 8240 1726773103.51249: done filtering new block on tags 8240 1726773103.51250: done iterating over new_blocks loaded from include file 8240 1726773103.51251: extending task lists for all hosts with included blocks 8240 1726773103.51382: done extending task lists 8240 1726773103.51383: done processing included files 8240 1726773103.51383: results queue empty 8240 1726773103.51384: checking for any_errors_fatal 8240 1726773103.51388: done checking for any_errors_fatal 8240 1726773103.51388: checking for max_fail_percentage 8240 1726773103.51389: done checking for max_fail_percentage 8240 1726773103.51389: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.51389: done checking to see if all hosts have failed 8240 1726773103.51390: getting the remaining hosts for this loop 8240 1726773103.51391: done getting the remaining hosts for this loop 8240 1726773103.51393: getting the next task for host managed_node2 8240 1726773103.51396: done getting next task for host managed_node2 8240 1726773103.51397: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8240 1726773103.51400: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773103.51407: getting variables 8240 1726773103.51408: in VariableManager get_vars() 8240 1726773103.51416: Calling all_inventory to load vars for managed_node2 8240 1726773103.51418: Calling groups_inventory to load vars for managed_node2 8240 1726773103.51419: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.51422: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.51423: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.51425: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.51499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.51612: done with get_vars() 8240 1726773103.51618: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.021) 0:01:22.160 **** 8240 1726773103.51664: entering _queue_task() for managed_node2/setup 8240 1726773103.51833: worker is 1 (out of 1 available) 8240 1726773103.51846: exiting _queue_task() for managed_node2/setup 8240 1726773103.51858: done queuing things up, now waiting for results queue to drain 8240 1726773103.51859: waiting for pending results... 11376 1726773103.51988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 11376 1726773103.52137: in run() - task 0affffe7-6841-885f-bbcf-000000000f4d 11376 1726773103.52153: variable 'ansible_search_path' from source: unknown 11376 1726773103.52157: variable 'ansible_search_path' from source: unknown 11376 1726773103.52184: calling self._execute() 11376 1726773103.52253: variable 'ansible_host' from source: host vars for 'managed_node2' 11376 1726773103.52262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11376 1726773103.52271: variable 'omit' from source: magic vars 11376 1726773103.52639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11376 1726773103.54220: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11376 1726773103.54270: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11376 1726773103.54304: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11376 1726773103.54331: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11376 1726773103.54352: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11376 1726773103.54412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11376 1726773103.54434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11376 1726773103.54452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11376 1726773103.54479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11376 1726773103.54493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11376 1726773103.54534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11376 1726773103.54552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11376 1726773103.54570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11376 1726773103.54597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11376 1726773103.54610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11376 1726773103.54731: variable '__kernel_settings_required_facts' from source: role '' all vars 11376 1726773103.54743: variable 'ansible_facts' from source: unknown 11376 1726773103.54810: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11376 1726773103.54816: when evaluation is False, skipping this task 11376 1726773103.54820: _execute() done 11376 1726773103.54824: dumping result to json 11376 1726773103.54828: done dumping result, returning 11376 1726773103.54835: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-885f-bbcf-000000000f4d] 11376 1726773103.54841: sending task result for task 0affffe7-6841-885f-bbcf-000000000f4d 11376 1726773103.54865: done sending task result for task 0affffe7-6841-885f-bbcf-000000000f4d 11376 1726773103.54868: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8240 1726773103.54979: no more pending results, returning what we have 8240 1726773103.54983: results queue empty 8240 1726773103.54984: checking for any_errors_fatal 8240 1726773103.54988: done checking for any_errors_fatal 8240 1726773103.54988: checking for max_fail_percentage 8240 1726773103.54990: done checking for max_fail_percentage 8240 1726773103.54990: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.54992: done checking to see if all hosts have failed 8240 1726773103.54992: getting the remaining hosts for this loop 8240 1726773103.54993: done getting the remaining hosts for this loop 8240 1726773103.54997: getting the next task for host managed_node2 8240 1726773103.55007: done getting next task for host managed_node2 8240 
1726773103.55010: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8240 1726773103.55015: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.55033: getting variables 8240 1726773103.55035: in VariableManager get_vars() 8240 1726773103.55070: Calling all_inventory to load vars for managed_node2 8240 1726773103.55073: Calling groups_inventory to load vars for managed_node2 8240 1726773103.55075: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.55084: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.55090: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.55092: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.55220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.55348: done with get_vars() 8240 1726773103.55356: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.037) 0:01:22.198 **** 8240 1726773103.55432: entering _queue_task() for managed_node2/stat 8240 1726773103.55597: worker is 1 (out of 1 available) 8240 1726773103.55611: exiting _queue_task() for managed_node2/stat 8240 1726773103.55625: done queuing things up, now waiting for results queue to drain 8240 1726773103.55627: waiting for pending results... 
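The "Ensure ansible_facts used by role" task above was skipped because __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False: every fact the role requires is already present, so the setup module does not need to run again. A small Python sketch of the same set-difference test, using hypothetical fact names rather than the role's actual list:

# Hypothetical stand-ins for __kernel_settings_required_facts and ansible_facts.
required_facts = {"distribution", "distribution_major_version", "os_family"}
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "8",
    "os_family": "RedHat",
    "kernel": "4.18.0",
}

# Jinja2's difference filter: required facts missing from the gathered keys.
missing = required_facts - set(ansible_facts.keys())

# The setup task would only run when something is missing.
needs_gathering = len(missing) > 0
print(f"missing facts: {sorted(missing)}; needs gathering: {needs_gathering}")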
11377 1726773103.55761: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 11377 1726773103.55911: in run() - task 0affffe7-6841-885f-bbcf-000000000f4f 11377 1726773103.55927: variable 'ansible_search_path' from source: unknown 11377 1726773103.55931: variable 'ansible_search_path' from source: unknown 11377 1726773103.55958: calling self._execute() 11377 1726773103.56030: variable 'ansible_host' from source: host vars for 'managed_node2' 11377 1726773103.56039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11377 1726773103.56048: variable 'omit' from source: magic vars 11377 1726773103.56386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11377 1726773103.56608: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11377 1726773103.56644: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11377 1726773103.56669: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11377 1726773103.56699: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11377 1726773103.56760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11377 1726773103.56780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11377 1726773103.56803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11377 1726773103.56822: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11377 1726773103.56917: variable '__kernel_settings_is_ostree' from source: set_fact 11377 1726773103.56929: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 11377 1726773103.56934: when evaluation is False, skipping this task 11377 1726773103.56938: _execute() done 11377 1726773103.56942: dumping result to json 11377 1726773103.56945: done dumping result, returning 11377 1726773103.56951: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-885f-bbcf-000000000f4f] 11377 1726773103.56957: sending task result for task 0affffe7-6841-885f-bbcf-000000000f4f 11377 1726773103.56980: done sending task result for task 0affffe7-6841-885f-bbcf-000000000f4f 11377 1726773103.56983: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773103.57094: no more pending results, returning what we have 8240 1726773103.57097: results queue empty 8240 1726773103.57098: checking for any_errors_fatal 8240 1726773103.57107: done checking for any_errors_fatal 8240 1726773103.57107: checking for max_fail_percentage 8240 1726773103.57109: done checking for max_fail_percentage 8240 1726773103.57109: checking to see if all hosts have failed and the 
running result is not ok 8240 1726773103.57110: done checking to see if all hosts have failed 8240 1726773103.57111: getting the remaining hosts for this loop 8240 1726773103.57112: done getting the remaining hosts for this loop 8240 1726773103.57116: getting the next task for host managed_node2 8240 1726773103.57123: done getting next task for host managed_node2 8240 1726773103.57126: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8240 1726773103.57130: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.57147: getting variables 8240 1726773103.57148: in VariableManager get_vars() 8240 1726773103.57181: Calling all_inventory to load vars for managed_node2 8240 1726773103.57184: Calling groups_inventory to load vars for managed_node2 8240 1726773103.57187: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.57196: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.57199: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.57202: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.57310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.57454: done with get_vars() 8240 1726773103.57460: done getting variables 8240 1726773103.57502: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.020) 0:01:22.219 **** 8240 1726773103.57528: entering _queue_task() for managed_node2/set_fact 8240 1726773103.57688: worker is 1 (out of 1 available) 8240 1726773103.57702: exiting _queue_task() for managed_node2/set_fact 8240 1726773103.57716: done queuing things up, now waiting for results queue to drain 8240 1726773103.57717: waiting for pending results... 
11378 1726773103.57847: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 11378 1726773103.57975: in run() - task 0affffe7-6841-885f-bbcf-000000000f50 11378 1726773103.57994: variable 'ansible_search_path' from source: unknown 11378 1726773103.57999: variable 'ansible_search_path' from source: unknown 11378 1726773103.58028: calling self._execute() 11378 1726773103.58098: variable 'ansible_host' from source: host vars for 'managed_node2' 11378 1726773103.58108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11378 1726773103.58117: variable 'omit' from source: magic vars 11378 1726773103.58449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11378 1726773103.58626: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11378 1726773103.58659: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11378 1726773103.58689: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11378 1726773103.58719: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11378 1726773103.58776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11378 1726773103.58797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11378 1726773103.58819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11378 1726773103.58838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11378 1726773103.58927: variable '__kernel_settings_is_ostree' from source: set_fact 11378 1726773103.58938: Evaluated conditional (not __kernel_settings_is_ostree is defined): False 11378 1726773103.58942: when evaluation is False, skipping this task 11378 1726773103.58946: _execute() done 11378 1726773103.58950: dumping result to json 11378 1726773103.58954: done dumping result, returning 11378 1726773103.58960: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-885f-bbcf-000000000f50] 11378 1726773103.58965: sending task result for task 0affffe7-6841-885f-bbcf-000000000f50 11378 1726773103.58989: done sending task result for task 0affffe7-6841-885f-bbcf-000000000f50 11378 1726773103.58992: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_ostree is defined", "skip_reason": "Conditional result was False" } 8240 1726773103.59091: no more pending results, returning what we have 8240 1726773103.59094: results queue empty 8240 1726773103.59095: checking for any_errors_fatal 8240 1726773103.59102: done checking for any_errors_fatal 8240 1726773103.59102: checking for max_fail_percentage 8240 1726773103.59104: done checking for max_fail_percentage 8240 1726773103.59105: checking to see if all 
hosts have failed and the running result is not ok 8240 1726773103.59105: done checking to see if all hosts have failed 8240 1726773103.59106: getting the remaining hosts for this loop 8240 1726773103.59107: done getting the remaining hosts for this loop 8240 1726773103.59111: getting the next task for host managed_node2 8240 1726773103.59120: done getting next task for host managed_node2 8240 1726773103.59123: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8240 1726773103.59127: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.59143: getting variables 8240 1726773103.59145: in VariableManager get_vars() 8240 1726773103.59176: Calling all_inventory to load vars for managed_node2 8240 1726773103.59179: Calling groups_inventory to load vars for managed_node2 8240 1726773103.59181: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.59190: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.59192: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.59194: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.59299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.59416: done with get_vars() 8240 1726773103.59424: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.019) 0:01:22.238 **** 8240 1726773103.59490: entering _queue_task() for managed_node2/stat 8240 1726773103.59637: worker is 1 (out of 1 available) 8240 1726773103.59650: exiting _queue_task() for managed_node2/stat 8240 1726773103.59663: done queuing things up, now waiting for results queue to drain 8240 1726773103.59665: waiting for pending results... 
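Annotation: the two tasks above ("Check if system is ostree" at set_vars.yml:10 and "Set flag to indicate system is ostree" at set_vars.yml:15) are both skipped because __kernel_settings_is_ostree was already set earlier in the run. The usual stat + set_fact pairing looks roughly like the sketch below; the checked path and the register variable name are assumptions, while the task names, modules, and when conditions come from the log:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted          # assumed marker file, not shown in this log
      register: __ostree_booted_stat      # hypothetical register name
      when: not __kernel_settings_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __kernel_settings_is_ostree is defined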
11379 1726773103.59788: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 11379 1726773103.59915: in run() - task 0affffe7-6841-885f-bbcf-000000000f52 11379 1726773103.59932: variable 'ansible_search_path' from source: unknown 11379 1726773103.59936: variable 'ansible_search_path' from source: unknown 11379 1726773103.59963: calling self._execute() 11379 1726773103.60035: variable 'ansible_host' from source: host vars for 'managed_node2' 11379 1726773103.60043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11379 1726773103.60052: variable 'omit' from source: magic vars 11379 1726773103.60384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11379 1726773103.60615: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11379 1726773103.60648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11379 1726773103.60674: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11379 1726773103.60700: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11379 1726773103.60759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11379 1726773103.60780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11379 1726773103.60800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11379 1726773103.60821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11379 1726773103.60909: variable '__kernel_settings_is_transactional' from source: set_fact 11379 1726773103.60920: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 11379 1726773103.60924: when evaluation is False, skipping this task 11379 1726773103.60928: _execute() done 11379 1726773103.60931: dumping result to json 11379 1726773103.60935: done dumping result, returning 11379 1726773103.60941: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-885f-bbcf-000000000f52] 11379 1726773103.60946: sending task result for task 0affffe7-6841-885f-bbcf-000000000f52 11379 1726773103.60969: done sending task result for task 0affffe7-6841-885f-bbcf-000000000f52 11379 1726773103.60972: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8240 1726773103.61074: no more pending results, returning what we have 8240 1726773103.61077: results queue empty 8240 1726773103.61078: checking for any_errors_fatal 8240 1726773103.61088: done checking for any_errors_fatal 8240 1726773103.61089: checking for max_fail_percentage 8240 1726773103.61091: done checking for max_fail_percentage 8240 
1726773103.61091: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.61092: done checking to see if all hosts have failed 8240 1726773103.61092: getting the remaining hosts for this loop 8240 1726773103.61094: done getting the remaining hosts for this loop 8240 1726773103.61097: getting the next task for host managed_node2 8240 1726773103.61104: done getting next task for host managed_node2 8240 1726773103.61108: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8240 1726773103.61112: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.61128: getting variables 8240 1726773103.61129: in VariableManager get_vars() 8240 1726773103.61160: Calling all_inventory to load vars for managed_node2 8240 1726773103.61162: Calling groups_inventory to load vars for managed_node2 8240 1726773103.61164: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.61173: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.61175: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.61177: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.61321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.61436: done with get_vars() 8240 1726773103.61443: done getting variables 8240 1726773103.61480: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.020) 0:01:22.258 **** 8240 1726773103.61508: entering _queue_task() for managed_node2/set_fact 8240 1726773103.61657: worker is 1 (out of 1 available) 8240 1726773103.61671: exiting _queue_task() for managed_node2/set_fact 8240 1726773103.61684: done queuing things up, now waiting for results queue to drain 8240 1726773103.61687: waiting for pending results... 
11380 1726773103.61812: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 11380 1726773103.61941: in run() - task 0affffe7-6841-885f-bbcf-000000000f53 11380 1726773103.61957: variable 'ansible_search_path' from source: unknown 11380 1726773103.61961: variable 'ansible_search_path' from source: unknown 11380 1726773103.61988: calling self._execute() 11380 1726773103.62060: variable 'ansible_host' from source: host vars for 'managed_node2' 11380 1726773103.62069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11380 1726773103.62077: variable 'omit' from source: magic vars 11380 1726773103.62412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11380 1726773103.62587: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11380 1726773103.62625: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11380 1726773103.62651: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11380 1726773103.62679: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11380 1726773103.62739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11380 1726773103.62759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11380 1726773103.62779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11380 1726773103.62800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11380 1726773103.62889: variable '__kernel_settings_is_transactional' from source: set_fact 11380 1726773103.62900: Evaluated conditional (not __kernel_settings_is_transactional is defined): False 11380 1726773103.62907: when evaluation is False, skipping this task 11380 1726773103.62911: _execute() done 11380 1726773103.62915: dumping result to json 11380 1726773103.62919: done dumping result, returning 11380 1726773103.62924: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-885f-bbcf-000000000f53] 11380 1726773103.62930: sending task result for task 0affffe7-6841-885f-bbcf-000000000f53 11380 1726773103.62953: done sending task result for task 0affffe7-6841-885f-bbcf-000000000f53 11380 1726773103.62956: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __kernel_settings_is_transactional is defined", "skip_reason": "Conditional result was False" } 8240 1726773103.63053: no more pending results, returning what we have 8240 1726773103.63056: results queue empty 8240 1726773103.63057: checking for any_errors_fatal 8240 1726773103.63064: done checking for any_errors_fatal 8240 1726773103.63065: checking for max_fail_percentage 8240 1726773103.63067: done checking for max_fail_percentage 8240 1726773103.63067: 
checking to see if all hosts have failed and the running result is not ok 8240 1726773103.63068: done checking to see if all hosts have failed 8240 1726773103.63069: getting the remaining hosts for this loop 8240 1726773103.63070: done getting the remaining hosts for this loop 8240 1726773103.63073: getting the next task for host managed_node2 8240 1726773103.63082: done getting next task for host managed_node2 8240 1726773103.63089: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8240 1726773103.63093: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.63110: getting variables 8240 1726773103.63112: in VariableManager get_vars() 8240 1726773103.63143: Calling all_inventory to load vars for managed_node2 8240 1726773103.63145: Calling groups_inventory to load vars for managed_node2 8240 1726773103.63147: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.63155: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.63157: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.63159: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.63266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.63386: done with get_vars() 8240 1726773103.63394: done getting variables 8240 1726773103.63433: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.019) 0:01:22.278 **** 8240 1726773103.63459: entering _queue_task() for managed_node2/include_vars 8240 1726773103.63612: worker is 1 (out of 1 available) 8240 1726773103.63626: exiting _queue_task() for managed_node2/include_vars 8240 1726773103.63639: done queuing things up, now waiting for results queue to drain 8240 1726773103.63641: waiting for pending results... 
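Annotation: likewise, "Check if transactional-update exists in /sbin" (set_vars.yml:22) and "Set flag if transactional-update exists" (set_vars.yml:27) are skipped because __kernel_settings_is_transactional is already defined. A sketch of that pattern follows; the binary path is inferred from the task name and the register name is invented for illustration:

    - name: Check if transactional-update exists in /sbin
      ansible.builtin.stat:
        path: /sbin/transactional-update
      register: __transactional_update_stat   # hypothetical register name
      when: not __kernel_settings_is_transactional is defined

    - name: Set flag if transactional-update exists
      ansible.builtin.set_fact:
        __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
      when: not __kernel_settings_is_transactional is defined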
11381 1726773103.63766: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 11381 1726773103.63897: in run() - task 0affffe7-6841-885f-bbcf-000000000f55 11381 1726773103.63913: variable 'ansible_search_path' from source: unknown 11381 1726773103.63918: variable 'ansible_search_path' from source: unknown 11381 1726773103.63944: calling self._execute() 11381 1726773103.64014: variable 'ansible_host' from source: host vars for 'managed_node2' 11381 1726773103.64023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11381 1726773103.64031: variable 'omit' from source: magic vars 11381 1726773103.64108: variable 'omit' from source: magic vars 11381 1726773103.64155: variable 'omit' from source: magic vars 11381 1726773103.64423: variable 'ffparams' from source: task vars 11381 1726773103.64572: variable 'ansible_facts' from source: unknown 11381 1726773103.64697: variable 'ansible_facts' from source: unknown 11381 1726773103.64784: variable 'ansible_facts' from source: unknown 11381 1726773103.64871: variable 'ansible_facts' from source: unknown 11381 1726773103.64950: variable 'role_path' from source: magic vars 11381 1726773103.65065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11381 1726773103.65208: Loaded config def from plugin (lookup/first_found) 11381 1726773103.65217: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 11381 1726773103.65244: variable 'ansible_search_path' from source: unknown 11381 1726773103.65261: variable 'ansible_search_path' from source: unknown 11381 1726773103.65270: variable 'ansible_search_path' from source: unknown 11381 1726773103.65276: variable 'ansible_search_path' from source: unknown 11381 1726773103.65281: variable 'ansible_search_path' from source: unknown 11381 1726773103.65297: variable 'omit' from source: magic vars 11381 1726773103.65317: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11381 1726773103.65335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11381 1726773103.65352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11381 1726773103.65365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11381 1726773103.65373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11381 1726773103.65397: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11381 1726773103.65404: variable 'ansible_host' from source: host vars for 'managed_node2' 11381 1726773103.65409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11381 1726773103.65469: Set connection var ansible_pipelining to False 11381 1726773103.65476: Set connection var ansible_timeout to 10 11381 1726773103.65484: Set connection var ansible_module_compression to ZIP_DEFLATED 11381 1726773103.65490: Set connection var ansible_shell_type to sh 11381 1726773103.65496: Set connection var ansible_shell_executable to /bin/sh 11381 1726773103.65503: Set connection var ansible_connection to ssh 11381 1726773103.65518: variable 'ansible_shell_executable' from source: unknown 11381 1726773103.65521: variable 'ansible_connection' from source: unknown 
11381 1726773103.65525: variable 'ansible_module_compression' from source: unknown 11381 1726773103.65528: variable 'ansible_shell_type' from source: unknown 11381 1726773103.65531: variable 'ansible_shell_executable' from source: unknown 11381 1726773103.65534: variable 'ansible_host' from source: host vars for 'managed_node2' 11381 1726773103.65539: variable 'ansible_pipelining' from source: unknown 11381 1726773103.65542: variable 'ansible_timeout' from source: unknown 11381 1726773103.65546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11381 1726773103.65620: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11381 1726773103.65632: variable 'omit' from source: magic vars 11381 1726773103.65638: starting attempt loop 11381 1726773103.65641: running the handler 11381 1726773103.65680: handler run complete 11381 1726773103.65693: attempt loop complete, returning result 11381 1726773103.65696: _execute() done 11381 1726773103.65699: dumping result to json 11381 1726773103.65706: done dumping result, returning 11381 1726773103.65711: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-885f-bbcf-000000000f55] 11381 1726773103.65715: sending task result for task 0affffe7-6841-885f-bbcf-000000000f55 11381 1726773103.65736: done sending task result for task 0affffe7-6841-885f-bbcf-000000000f55 11381 1726773103.65738: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8240 1726773103.65913: no more pending results, returning what we have 8240 1726773103.65917: results queue empty 8240 1726773103.65918: checking for any_errors_fatal 8240 1726773103.65923: done checking for any_errors_fatal 8240 1726773103.65923: checking for max_fail_percentage 8240 1726773103.65924: done checking for max_fail_percentage 8240 1726773103.65925: checking to see if all hosts have failed and the running result is not ok 8240 1726773103.65926: done checking to see if all hosts have failed 8240 1726773103.65927: getting the remaining hosts for this loop 8240 1726773103.65928: done getting the remaining hosts for this loop 8240 1726773103.65931: getting the next task for host managed_node2 8240 1726773103.65941: done getting next task for host managed_node2 8240 1726773103.65944: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8240 1726773103.65947: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773103.65957: getting variables 8240 1726773103.65959: in VariableManager get_vars() 8240 1726773103.65992: Calling all_inventory to load vars for managed_node2 8240 1726773103.65994: Calling groups_inventory to load vars for managed_node2 8240 1726773103.65995: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773103.66007: Calling all_plugins_play to load vars for managed_node2 8240 1726773103.66009: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773103.66011: Calling groups_plugins_play to load vars for managed_node2 8240 1726773103.66151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773103.66264: done with get_vars() 8240 1726773103.66270: done getting variables 8240 1726773103.66312: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:11:43 -0400 (0:00:00.028) 0:01:22.307 **** 8240 1726773103.66335: entering _queue_task() for managed_node2/package 8240 1726773103.66484: worker is 1 (out of 1 available) 8240 1726773103.66500: exiting _queue_task() for managed_node2/package 8240 1726773103.66514: done queuing things up, now waiting for results queue to drain 8240 1726773103.66516: waiting for pending results... 
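Annotation: the "Set platform/version specific variables" task above runs the include_vars action with a first_found lookup over the task var ffparams and ends up loading roles/kernel_settings/vars/default.yml, which sets __kernel_settings_packages and __kernel_settings_services. A minimal sketch of that pattern is below; ffparams, default.yml, and the resulting values are confirmed by the log, while the other candidate file names in the list are assumptions:

    # Sketch of the first_found include_vars pattern; candidate file
    # names other than default.yml are assumptions.
    - name: Set platform/version specific variables
      ansible.builtin.include_vars:
        file: "{{ lookup('first_found', ffparams) }}"
      vars:
        ffparams:
          files:
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - default.yml
          paths:
            - "{{ role_path }}/vars"

    # Values loaded from vars/default.yml, as reported in the result above:
    # __kernel_settings_packages: [tuned, python3-configobj]
    # __kernel_settings_services: [tuned]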
11382 1726773103.66640: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 11382 1726773103.66756: in run() - task 0affffe7-6841-885f-bbcf-000000000ed3 11382 1726773103.66773: variable 'ansible_search_path' from source: unknown 11382 1726773103.66777: variable 'ansible_search_path' from source: unknown 11382 1726773103.66806: calling self._execute() 11382 1726773103.66876: variable 'ansible_host' from source: host vars for 'managed_node2' 11382 1726773103.66886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11382 1726773103.66895: variable 'omit' from source: magic vars 11382 1726773103.66968: variable 'omit' from source: magic vars 11382 1726773103.67009: variable 'omit' from source: magic vars 11382 1726773103.67031: variable '__kernel_settings_packages' from source: include_vars 11382 1726773103.67247: variable '__kernel_settings_packages' from source: include_vars 11382 1726773103.67407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11382 1726773103.68912: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11382 1726773103.68968: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11382 1726773103.68999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11382 1726773103.69028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11382 1726773103.69049: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11382 1726773103.69120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11382 1726773103.69142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11382 1726773103.69161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11382 1726773103.69192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11382 1726773103.69206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11382 1726773103.69277: variable '__kernel_settings_is_ostree' from source: set_fact 11382 1726773103.69284: variable 'omit' from source: magic vars 11382 1726773103.69311: variable 'omit' from source: magic vars 11382 1726773103.69332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11382 1726773103.69353: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11382 1726773103.69369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11382 1726773103.69383: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11382 1726773103.69394: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11382 1726773103.69419: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11382 1726773103.69425: variable 'ansible_host' from source: host vars for 'managed_node2' 11382 1726773103.69429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11382 1726773103.69495: Set connection var ansible_pipelining to False 11382 1726773103.69505: Set connection var ansible_timeout to 10 11382 1726773103.69513: Set connection var ansible_module_compression to ZIP_DEFLATED 11382 1726773103.69517: Set connection var ansible_shell_type to sh 11382 1726773103.69522: Set connection var ansible_shell_executable to /bin/sh 11382 1726773103.69527: Set connection var ansible_connection to ssh 11382 1726773103.69543: variable 'ansible_shell_executable' from source: unknown 11382 1726773103.69546: variable 'ansible_connection' from source: unknown 11382 1726773103.69549: variable 'ansible_module_compression' from source: unknown 11382 1726773103.69553: variable 'ansible_shell_type' from source: unknown 11382 1726773103.69556: variable 'ansible_shell_executable' from source: unknown 11382 1726773103.69559: variable 'ansible_host' from source: host vars for 'managed_node2' 11382 1726773103.69563: variable 'ansible_pipelining' from source: unknown 11382 1726773103.69567: variable 'ansible_timeout' from source: unknown 11382 1726773103.69572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11382 1726773103.69635: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11382 1726773103.69647: variable 'omit' from source: magic vars 11382 1726773103.69653: starting attempt loop 11382 1726773103.69656: running the handler 11382 1726773103.69719: variable 'ansible_facts' from source: unknown 11382 1726773103.69800: _low_level_execute_command(): starting 11382 1726773103.69811: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11382 1726773103.72139: stdout chunk (state=2): >>>/root <<< 11382 1726773103.72263: stderr chunk (state=3): >>><<< 11382 1726773103.72270: stdout chunk (state=3): >>><<< 11382 1726773103.72290: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11382 1726773103.72305: _low_level_execute_command(): starting 11382 1726773103.72311: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156 `" && echo ansible-tmp-1726773103.7229872-11382-267066699054156="` echo /root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156 `" ) && sleep 0' 11382 1726773103.74902: stdout chunk (state=2): >>>ansible-tmp-1726773103.7229872-11382-267066699054156=/root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156 <<< 11382 1726773103.75037: stderr chunk (state=3): >>><<< 11382 1726773103.75044: stdout chunk (state=3): >>><<< 11382 1726773103.75060: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773103.7229872-11382-267066699054156=/root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156 , stderr= 11382 1726773103.75089: variable 'ansible_module_compression' from source: unknown 11382 1726773103.75141: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11382 1726773103.75182: variable 'ansible_facts' from source: unknown 11382 1726773103.75276: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156/AnsiballZ_dnf.py 11382 1726773103.75383: Sending initial data 11382 1726773103.75391: Sent initial data (151 bytes) 11382 1726773103.77883: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp1la627lv /root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156/AnsiballZ_dnf.py <<< 11382 1726773103.79266: stderr chunk (state=3): >>><<< 11382 1726773103.79276: stdout chunk (state=3): >>><<< 11382 1726773103.79297: done transferring module to remote 11382 1726773103.79311: _low_level_execute_command(): starting 11382 1726773103.79316: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156/ /root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156/AnsiballZ_dnf.py && sleep 0' 11382 1726773103.81691: stderr chunk (state=2): >>><<< 11382 1726773103.81705: stdout chunk (state=2): >>><<< 11382 1726773103.81721: _low_level_execute_command() done: rc=0, stdout=, stderr= 11382 1726773103.81725: _low_level_execute_command(): starting 11382 1726773103.81730: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156/AnsiballZ_dnf.py && sleep 0' 11382 1726773106.35730: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 11382 1726773106.43563: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 11382 1726773106.43614: stderr chunk (state=3): >>><<< 11382 1726773106.43621: stdout chunk (state=3): >>><<< 11382 1726773106.43637: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11382 1726773106.43671: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11382 1726773106.43679: _low_level_execute_command(): starting 11382 1726773106.43686: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773103.7229872-11382-267066699054156/ > /dev/null 2>&1 && sleep 0' 11382 1726773106.46108: stderr chunk (state=2): >>><<< 11382 1726773106.46117: stdout chunk (state=2): >>><<< 11382 1726773106.46130: _low_level_execute_command() done: rc=0, stdout=, stderr= 11382 1726773106.46137: handler run complete 11382 1726773106.46165: attempt loop complete, returning result 11382 1726773106.46169: _execute() done 11382 1726773106.46172: dumping result to json 11382 1726773106.46178: done dumping result, returning 11382 1726773106.46186: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-885f-bbcf-000000000ed3] 11382 1726773106.46192: sending task result for task 0affffe7-6841-885f-bbcf-000000000ed3 11382 1726773106.46223: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ed3 11382 1726773106.46226: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8240 1726773106.46393: no more pending results, returning what we have 8240 1726773106.46398: results queue empty 8240 1726773106.46399: checking for any_errors_fatal 8240 1726773106.46408: done checking for any_errors_fatal 8240 1726773106.46409: checking for max_fail_percentage 8240 1726773106.46410: done checking for max_fail_percentage 8240 1726773106.46410: checking to see if all hosts have failed and the running result is not ok 8240 1726773106.46411: done checking to see if all hosts have failed 8240 1726773106.46412: getting the remaining hosts for this loop 8240 
1726773106.46413: done getting the remaining hosts for this loop 8240 1726773106.46417: getting the next task for host managed_node2 8240 1726773106.46426: done getting next task for host managed_node2 8240 1726773106.46429: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8240 1726773106.46432: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773106.46443: getting variables 8240 1726773106.46445: in VariableManager get_vars() 8240 1726773106.46478: Calling all_inventory to load vars for managed_node2 8240 1726773106.46481: Calling groups_inventory to load vars for managed_node2 8240 1726773106.46482: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773106.46492: Calling all_plugins_play to load vars for managed_node2 8240 1726773106.46495: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773106.46496: Calling groups_plugins_play to load vars for managed_node2 8240 1726773106.46604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773106.46726: done with get_vars() 8240 1726773106.46735: done getting variables 8240 1726773106.46779: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:11:46 -0400 (0:00:02.804) 0:01:25.111 **** 8240 1726773106.46810: entering _queue_task() for managed_node2/debug 8240 1726773106.46974: worker is 1 (out of 1 available) 8240 1726773106.46991: exiting _queue_task() for managed_node2/debug 8240 1726773106.47004: done queuing things up, now waiting for results queue to drain 8240 1726773106.47006: waiting for pending results... 
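Annotation: the "Ensure required packages are installed" task above resolves the package action to ansible.legacy.dnf, ships AnsiballZ_dnf.py to the target over SSH/SFTP, runs it, and cleans up the temporary directory; the module returns {"msg": "Nothing to do", "changed": false} because tuned and python3-configobj are already installed. The task itself reduces to roughly the sketch below (the package list and state are taken from the module_args in the log; __kernel_settings_is_ostree is also consulted during execution, presumably to pick the package backend, but that detail is not reproduced here):

    # Sketch of the package task; the package list comes from
    # vars/default.yml as shown in the include_vars result above.
    - name: Ensure required packages are installed
      ansible.builtin.package:
        name: "{{ __kernel_settings_packages }}"
        state: present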
11399 1726773106.47142: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 11399 1726773106.47264: in run() - task 0affffe7-6841-885f-bbcf-000000000ed5 11399 1726773106.47281: variable 'ansible_search_path' from source: unknown 11399 1726773106.47286: variable 'ansible_search_path' from source: unknown 11399 1726773106.47317: calling self._execute() 11399 1726773106.47388: variable 'ansible_host' from source: host vars for 'managed_node2' 11399 1726773106.47397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11399 1726773106.47408: variable 'omit' from source: magic vars 11399 1726773106.47761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11399 1726773106.49339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11399 1726773106.49388: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11399 1726773106.49427: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11399 1726773106.49457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11399 1726773106.49477: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11399 1726773106.49535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11399 1726773106.49557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11399 1726773106.49576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11399 1726773106.49605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11399 1726773106.49618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11399 1726773106.49694: variable '__kernel_settings_is_transactional' from source: set_fact 11399 1726773106.49711: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11399 1726773106.49715: when evaluation is False, skipping this task 11399 1726773106.49719: _execute() done 11399 1726773106.49723: dumping result to json 11399 1726773106.49726: done dumping result, returning 11399 1726773106.49732: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-000000000ed5] 11399 1726773106.49737: sending task result for task 0affffe7-6841-885f-bbcf-000000000ed5 11399 1726773106.49760: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ed5 11399 1726773106.49763: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "__kernel_settings_is_transactional | 
d(false)" } 8240 1726773106.49870: no more pending results, returning what we have 8240 1726773106.49874: results queue empty 8240 1726773106.49875: checking for any_errors_fatal 8240 1726773106.49882: done checking for any_errors_fatal 8240 1726773106.49882: checking for max_fail_percentage 8240 1726773106.49884: done checking for max_fail_percentage 8240 1726773106.49884: checking to see if all hosts have failed and the running result is not ok 8240 1726773106.49887: done checking to see if all hosts have failed 8240 1726773106.49888: getting the remaining hosts for this loop 8240 1726773106.49889: done getting the remaining hosts for this loop 8240 1726773106.49892: getting the next task for host managed_node2 8240 1726773106.49899: done getting next task for host managed_node2 8240 1726773106.49905: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8240 1726773106.49909: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773106.49925: getting variables 8240 1726773106.49927: in VariableManager get_vars() 8240 1726773106.49960: Calling all_inventory to load vars for managed_node2 8240 1726773106.49963: Calling groups_inventory to load vars for managed_node2 8240 1726773106.49965: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773106.49975: Calling all_plugins_play to load vars for managed_node2 8240 1726773106.49978: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773106.49981: Calling groups_plugins_play to load vars for managed_node2 8240 1726773106.50098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773106.50267: done with get_vars() 8240 1726773106.50274: done getting variables 8240 1726773106.50318: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:11:46 -0400 (0:00:00.035) 0:01:25.147 **** 8240 1726773106.50342: entering _queue_task() for managed_node2/reboot 8240 1726773106.50507: worker is 1 (out of 1 available) 8240 1726773106.50522: exiting _queue_task() for managed_node2/reboot 8240 1726773106.50536: done queuing things up, now waiting for results queue to drain 8240 1726773106.50537: waiting for pending results... 
11400 1726773106.50657: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 11400 1726773106.50783: in run() - task 0affffe7-6841-885f-bbcf-000000000ed6 11400 1726773106.50801: variable 'ansible_search_path' from source: unknown 11400 1726773106.50806: variable 'ansible_search_path' from source: unknown 11400 1726773106.50835: calling self._execute() 11400 1726773106.50905: variable 'ansible_host' from source: host vars for 'managed_node2' 11400 1726773106.50913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11400 1726773106.50922: variable 'omit' from source: magic vars 11400 1726773106.51262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11400 1726773106.52763: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11400 1726773106.52813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11400 1726773106.52841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11400 1726773106.52868: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11400 1726773106.52890: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11400 1726773106.52945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11400 1726773106.52966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11400 1726773106.52984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11400 1726773106.53014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11400 1726773106.53026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11400 1726773106.53102: variable '__kernel_settings_is_transactional' from source: set_fact 11400 1726773106.53119: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11400 1726773106.53123: when evaluation is False, skipping this task 11400 1726773106.53126: _execute() done 11400 1726773106.53130: dumping result to json 11400 1726773106.53134: done dumping result, returning 11400 1726773106.53140: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-885f-bbcf-000000000ed6] 11400 1726773106.53145: sending task result for task 0affffe7-6841-885f-bbcf-000000000ed6 11400 1726773106.53169: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ed6 11400 1726773106.53173: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", 
"skip_reason": "Conditional result was False" } 8240 1726773106.53293: no more pending results, returning what we have 8240 1726773106.53296: results queue empty 8240 1726773106.53297: checking for any_errors_fatal 8240 1726773106.53306: done checking for any_errors_fatal 8240 1726773106.53307: checking for max_fail_percentage 8240 1726773106.53309: done checking for max_fail_percentage 8240 1726773106.53309: checking to see if all hosts have failed and the running result is not ok 8240 1726773106.53310: done checking to see if all hosts have failed 8240 1726773106.53311: getting the remaining hosts for this loop 8240 1726773106.53312: done getting the remaining hosts for this loop 8240 1726773106.53315: getting the next task for host managed_node2 8240 1726773106.53321: done getting next task for host managed_node2 8240 1726773106.53325: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8240 1726773106.53328: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773106.53344: getting variables 8240 1726773106.53346: in VariableManager get_vars() 8240 1726773106.53378: Calling all_inventory to load vars for managed_node2 8240 1726773106.53381: Calling groups_inventory to load vars for managed_node2 8240 1726773106.53383: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773106.53394: Calling all_plugins_play to load vars for managed_node2 8240 1726773106.53397: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773106.53399: Calling groups_plugins_play to load vars for managed_node2 8240 1726773106.53515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773106.53636: done with get_vars() 8240 1726773106.53644: done getting variables 8240 1726773106.53686: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:11:46 -0400 (0:00:00.033) 0:01:25.180 **** 8240 1726773106.53712: entering _queue_task() for managed_node2/fail 8240 1726773106.53870: worker is 1 (out of 1 available) 8240 1726773106.53884: exiting _queue_task() for managed_node2/fail 8240 1726773106.53899: done queuing things up, now waiting for results queue to drain 8240 1726773106.53903: waiting for pending results... 11401 1726773106.54026: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 11401 1726773106.54150: in run() - task 0affffe7-6841-885f-bbcf-000000000ed7 11401 1726773106.54167: variable 'ansible_search_path' from source: unknown 11401 1726773106.54171: variable 'ansible_search_path' from source: unknown 11401 1726773106.54200: calling self._execute() 11401 1726773106.54269: variable 'ansible_host' from source: host vars for 'managed_node2' 11401 1726773106.54277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11401 1726773106.54288: variable 'omit' from source: magic vars 11401 1726773106.54628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11401 1726773106.56177: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11401 1726773106.56229: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11401 1726773106.56266: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11401 1726773106.56295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11401 1726773106.56319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11401 1726773106.56373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11401 1726773106.56400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11401 1726773106.56421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11401 1726773106.56447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11401 1726773106.56459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11401 1726773106.56538: variable '__kernel_settings_is_transactional' from source: set_fact 11401 1726773106.56554: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 11401 1726773106.56558: when evaluation is False, skipping this task 11401 1726773106.56562: _execute() done 11401 1726773106.56566: dumping result to json 11401 1726773106.56570: done dumping result, returning 11401 1726773106.56576: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-885f-bbcf-000000000ed7] 11401 1726773106.56581: sending task result for task 0affffe7-6841-885f-bbcf-000000000ed7 11401 1726773106.56608: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ed7 11401 1726773106.56611: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8240 1726773106.56715: no more pending results, returning what we have 8240 1726773106.56718: results queue empty 8240 1726773106.56719: checking for any_errors_fatal 8240 1726773106.56726: done checking for any_errors_fatal 8240 1726773106.56727: checking for max_fail_percentage 8240 1726773106.56728: done checking for max_fail_percentage 8240 1726773106.56729: checking to see if all hosts have failed and the running result is not ok 8240 1726773106.56730: done checking to see if all hosts have failed 8240 1726773106.56731: getting the remaining hosts for this loop 8240 1726773106.56732: done getting the remaining hosts for this loop 8240 1726773106.56736: getting the next task for host managed_node2 8240 1726773106.56745: done getting next task for host managed_node2 8240 1726773106.56748: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8240 1726773106.56752: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 8240 1726773106.56769: getting variables 8240 1726773106.56771: in VariableManager get_vars() 8240 1726773106.56806: Calling all_inventory to load vars for managed_node2 8240 1726773106.56809: Calling groups_inventory to load vars for managed_node2 8240 1726773106.56811: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773106.56821: Calling all_plugins_play to load vars for managed_node2 8240 1726773106.56824: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773106.56826: Calling groups_plugins_play to load vars for managed_node2 8240 1726773106.56987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773106.57106: done with get_vars() 8240 1726773106.57114: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:11:46 -0400 (0:00:00.034) 0:01:25.215 **** 8240 1726773106.57170: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773106.57329: worker is 1 (out of 1 available) 8240 1726773106.57342: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773106.57357: done queuing things up, now waiting for results queue to drain 8240 1726773106.57359: waiting for pending results... 11402 1726773106.57491: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 11402 1726773106.57619: in run() - task 0affffe7-6841-885f-bbcf-000000000ed9 11402 1726773106.57636: variable 'ansible_search_path' from source: unknown 11402 1726773106.57640: variable 'ansible_search_path' from source: unknown 11402 1726773106.57667: calling self._execute() 11402 1726773106.57739: variable 'ansible_host' from source: host vars for 'managed_node2' 11402 1726773106.57748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11402 1726773106.57757: variable 'omit' from source: magic vars 11402 1726773106.57834: variable 'omit' from source: magic vars 11402 1726773106.57872: variable 'omit' from source: magic vars 11402 1726773106.57896: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 11402 1726773106.58115: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 11402 1726773106.58174: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11402 1726773106.58208: variable 'omit' from source: magic vars 11402 1726773106.58241: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11402 1726773106.58267: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11402 1726773106.58288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11402 1726773106.58304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11402 1726773106.58316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11402 1726773106.58340: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11402 1726773106.58345: variable 'ansible_host' from source: host vars for 
'managed_node2' 11402 1726773106.58348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11402 1726773106.58424: Set connection var ansible_pipelining to False 11402 1726773106.58431: Set connection var ansible_timeout to 10 11402 1726773106.58440: Set connection var ansible_module_compression to ZIP_DEFLATED 11402 1726773106.58443: Set connection var ansible_shell_type to sh 11402 1726773106.58448: Set connection var ansible_shell_executable to /bin/sh 11402 1726773106.58453: Set connection var ansible_connection to ssh 11402 1726773106.58466: variable 'ansible_shell_executable' from source: unknown 11402 1726773106.58469: variable 'ansible_connection' from source: unknown 11402 1726773106.58471: variable 'ansible_module_compression' from source: unknown 11402 1726773106.58472: variable 'ansible_shell_type' from source: unknown 11402 1726773106.58474: variable 'ansible_shell_executable' from source: unknown 11402 1726773106.58475: variable 'ansible_host' from source: host vars for 'managed_node2' 11402 1726773106.58477: variable 'ansible_pipelining' from source: unknown 11402 1726773106.58479: variable 'ansible_timeout' from source: unknown 11402 1726773106.58481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11402 1726773106.58607: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11402 1726773106.58616: variable 'omit' from source: magic vars 11402 1726773106.58620: starting attempt loop 11402 1726773106.58622: running the handler 11402 1726773106.58631: _low_level_execute_command(): starting 11402 1726773106.58637: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11402 1726773106.60962: stdout chunk (state=2): >>>/root <<< 11402 1726773106.61077: stderr chunk (state=3): >>><<< 11402 1726773106.61083: stdout chunk (state=3): >>><<< 11402 1726773106.61102: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11402 1726773106.61116: _low_level_execute_command(): starting 11402 1726773106.61121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536 `" && echo ansible-tmp-1726773106.6111002-11402-274574834124536="` echo /root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536 `" ) && sleep 0' 11402 1726773106.63671: stdout chunk (state=2): >>>ansible-tmp-1726773106.6111002-11402-274574834124536=/root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536 <<< 11402 1726773106.63805: stderr chunk (state=3): >>><<< 11402 1726773106.63812: stdout chunk (state=3): >>><<< 11402 1726773106.63827: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773106.6111002-11402-274574834124536=/root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536 , stderr= 11402 1726773106.63862: variable 'ansible_module_compression' from source: unknown 11402 1726773106.63897: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 11402 1726773106.63935: variable 'ansible_facts' from source: unknown 11402 1726773106.63995: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536/AnsiballZ_kernel_settings_get_config.py 11402 1726773106.64093: Sending initial data 11402 1726773106.64100: Sent initial data (174 bytes) 11402 1726773106.66584: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmph1c1mulo /root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536/AnsiballZ_kernel_settings_get_config.py <<< 11402 1726773106.67662: stderr chunk (state=3): >>><<< 11402 1726773106.67671: stdout chunk (state=3): >>><<< 11402 1726773106.67693: done transferring module to remote 11402 1726773106.67705: _low_level_execute_command(): starting 11402 1726773106.67710: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536/ /root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11402 1726773106.70048: stderr chunk (state=2): >>><<< 11402 1726773106.70056: stdout chunk (state=2): >>><<< 11402 1726773106.70070: _low_level_execute_command() done: rc=0, stdout=, stderr= 11402 1726773106.70075: _low_level_execute_command(): starting 11402 1726773106.70080: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11402 1726773106.85632: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 11402 1726773106.86742: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11402 1726773106.86791: stderr chunk (state=3): >>><<< 11402 1726773106.86797: stdout chunk (state=3): >>><<< 11402 1726773106.86815: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 
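The invocation above shows the role's private module fedora.linux_system_roles.kernel_settings_get_config being packaged with AnsiballZ, copied to a temporary directory on managed_node2 over SFTP, executed with /usr/libexec/platform-python, and returning the parsed contents of /etc/tuned/tuned-main.conf. A minimal sketch of the task that produces this call follows; the module name and path argument are taken from the recorded invocation, while the register target is inferred from the __kernel_settings_register_tuned_main variable resolved later in this log, so the exact register mechanics are an assumption.

# Sketch only: module and path are from the invocation above; the register
# name is inferred from variables resolved later in this log.
- name: Read tuned main config
  fedora.linux_system_roles.kernel_settings_get_config:
    path: /etc/tuned/tuned-main.conf
  register: __kernel_settings_register_tuned_main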
11402 1726773106.86846: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11402 1726773106.86858: _low_level_execute_command(): starting 11402 1726773106.86863: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773106.6111002-11402-274574834124536/ > /dev/null 2>&1 && sleep 0' 11402 1726773106.89337: stderr chunk (state=2): >>><<< 11402 1726773106.89346: stdout chunk (state=2): >>><<< 11402 1726773106.89361: _low_level_execute_command() done: rc=0, stdout=, stderr= 11402 1726773106.89368: handler run complete 11402 1726773106.89383: attempt loop complete, returning result 11402 1726773106.89388: _execute() done 11402 1726773106.89392: dumping result to json 11402 1726773106.89396: done dumping result, returning 11402 1726773106.89404: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-885f-bbcf-000000000ed9] 11402 1726773106.89412: sending task result for task 0affffe7-6841-885f-bbcf-000000000ed9 11402 1726773106.89442: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ed9 11402 1726773106.89446: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8240 1726773106.89594: no more pending results, returning what we have 8240 1726773106.89598: results queue empty 8240 1726773106.89599: checking for any_errors_fatal 8240 1726773106.89608: done checking for any_errors_fatal 8240 1726773106.89609: checking for max_fail_percentage 8240 1726773106.89611: done checking for max_fail_percentage 8240 1726773106.89611: checking to see if all hosts have failed and the running result is not ok 8240 1726773106.89612: done checking to see if all hosts have failed 8240 1726773106.89613: getting the remaining hosts for this loop 8240 1726773106.89614: done getting the remaining hosts for this loop 8240 1726773106.89617: getting the next task for host managed_node2 8240 1726773106.89624: done getting next task for host managed_node2 8240 1726773106.89626: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8240 1726773106.89630: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773106.89640: getting variables 8240 1726773106.89642: in VariableManager get_vars() 8240 1726773106.89675: Calling all_inventory to load vars for managed_node2 8240 1726773106.89678: Calling groups_inventory to load vars for managed_node2 8240 1726773106.89680: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773106.89691: Calling all_plugins_play to load vars for managed_node2 8240 1726773106.89694: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773106.89697: Calling groups_plugins_play to load vars for managed_node2 8240 1726773106.89817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773106.89939: done with get_vars() 8240 1726773106.89948: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:11:46 -0400 (0:00:00.328) 0:01:25.544 **** 8240 1726773106.90021: entering _queue_task() for managed_node2/stat 8240 1726773106.90194: worker is 1 (out of 1 available) 8240 1726773106.90211: exiting _queue_task() for managed_node2/stat 8240 1726773106.90223: done queuing things up, now waiting for results queue to drain 8240 1726773106.90225: waiting for pending results... 
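The task queued here loops a stat call over candidate tuned profile directories and skips empty candidates via item | length > 0, as the per-item results that follow show. A minimal sketch under those assumptions is below; the exact loop expression is not visible in the log (only the task vars __prof_from_conf, __data and __kernel_settings_register_tuned_main that feed it), and the candidate list noted in the comment is simply what this run produced.

# Sketch only: the loop source expression is assumed; in this run the items
# resolve to '', '/etc/tuned/profiles' and '/etc/tuned'.
- name: Find tuned profile parent directory
  ansible.builtin.stat:
    path: "{{ item }}"
  loop: "{{ __prof_from_conf }}"
  when: item | length > 0
  register: __kernel_settings_find_profile_dirs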
11410 1726773106.90355: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 11410 1726773106.90479: in run() - task 0affffe7-6841-885f-bbcf-000000000eda 11410 1726773106.90497: variable 'ansible_search_path' from source: unknown 11410 1726773106.90501: variable 'ansible_search_path' from source: unknown 11410 1726773106.90540: variable '__prof_from_conf' from source: task vars 11410 1726773106.90778: variable '__prof_from_conf' from source: task vars 11410 1726773106.90918: variable '__data' from source: task vars 11410 1726773106.90970: variable '__kernel_settings_register_tuned_main' from source: set_fact 11410 1726773106.91117: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11410 1726773106.91128: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11410 1726773106.91169: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11410 1726773106.91261: variable 'omit' from source: magic vars 11410 1726773106.91335: variable 'ansible_host' from source: host vars for 'managed_node2' 11410 1726773106.91345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11410 1726773106.91353: variable 'omit' from source: magic vars 11410 1726773106.91526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11410 1726773106.93017: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11410 1726773106.93071: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11410 1726773106.93104: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11410 1726773106.93132: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11410 1726773106.93153: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11410 1726773106.93210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11410 1726773106.93231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11410 1726773106.93249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11410 1726773106.93276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11410 1726773106.93289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11410 1726773106.93357: variable 'item' from source: unknown 11410 1726773106.93370: Evaluated conditional (item | length > 0): False 11410 1726773106.93374: when evaluation is False, skipping this task 11410 1726773106.93399: variable 'item' from source: unknown 11410 1726773106.93450: variable 'item' from source: unknown skipping: 
[managed_node2] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 11410 1726773106.93528: variable 'ansible_host' from source: host vars for 'managed_node2' 11410 1726773106.93538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11410 1726773106.93546: variable 'omit' from source: magic vars 11410 1726773106.93665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11410 1726773106.93684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11410 1726773106.93705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11410 1726773106.93732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11410 1726773106.93744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11410 1726773106.93797: variable 'item' from source: unknown 11410 1726773106.93808: Evaluated conditional (item | length > 0): True 11410 1726773106.93815: variable 'omit' from source: magic vars 11410 1726773106.93847: variable 'omit' from source: magic vars 11410 1726773106.93878: variable 'item' from source: unknown 11410 1726773106.93925: variable 'item' from source: unknown 11410 1726773106.93938: variable 'omit' from source: magic vars 11410 1726773106.93958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11410 1726773106.93979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11410 1726773106.93997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11410 1726773106.94012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11410 1726773106.94022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11410 1726773106.94044: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11410 1726773106.94048: variable 'ansible_host' from source: host vars for 'managed_node2' 11410 1726773106.94052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11410 1726773106.94118: Set connection var ansible_pipelining to False 11410 1726773106.94126: Set connection var ansible_timeout to 10 11410 1726773106.94133: Set connection var ansible_module_compression to ZIP_DEFLATED 11410 1726773106.94137: Set connection var ansible_shell_type to sh 11410 1726773106.94142: Set connection var ansible_shell_executable to /bin/sh 11410 1726773106.94147: Set connection var ansible_connection to ssh 11410 1726773106.94160: variable 'ansible_shell_executable' from source: unknown 11410 1726773106.94164: variable 'ansible_connection' 
from source: unknown 11410 1726773106.94168: variable 'ansible_module_compression' from source: unknown 11410 1726773106.94171: variable 'ansible_shell_type' from source: unknown 11410 1726773106.94174: variable 'ansible_shell_executable' from source: unknown 11410 1726773106.94178: variable 'ansible_host' from source: host vars for 'managed_node2' 11410 1726773106.94182: variable 'ansible_pipelining' from source: unknown 11410 1726773106.94187: variable 'ansible_timeout' from source: unknown 11410 1726773106.94191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11410 1726773106.94279: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11410 1726773106.94291: variable 'omit' from source: magic vars 11410 1726773106.94297: starting attempt loop 11410 1726773106.94300: running the handler 11410 1726773106.94314: _low_level_execute_command(): starting 11410 1726773106.94321: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11410 1726773106.96640: stdout chunk (state=2): >>>/root <<< 11410 1726773106.96763: stderr chunk (state=3): >>><<< 11410 1726773106.96770: stdout chunk (state=3): >>><<< 11410 1726773106.96791: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11410 1726773106.96806: _low_level_execute_command(): starting 11410 1726773106.96812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715 `" && echo ansible-tmp-1726773106.967994-11410-133076790498715="` echo /root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715 `" ) && sleep 0' 11410 1726773106.99569: stdout chunk (state=2): >>>ansible-tmp-1726773106.967994-11410-133076790498715=/root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715 <<< 11410 1726773106.99706: stderr chunk (state=3): >>><<< 11410 1726773106.99713: stdout chunk (state=3): >>><<< 11410 1726773106.99730: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773106.967994-11410-133076790498715=/root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715 , stderr= 11410 1726773106.99766: variable 'ansible_module_compression' from source: unknown 11410 1726773106.99810: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11410 1726773106.99843: variable 'ansible_facts' from source: unknown 11410 1726773106.99903: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715/AnsiballZ_stat.py 11410 1726773107.00003: Sending initial data 11410 1726773107.00010: Sent initial data (151 bytes) 11410 1726773107.02590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpjtbier24 /root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715/AnsiballZ_stat.py <<< 11410 1726773107.03690: stderr chunk (state=3): >>><<< 11410 1726773107.03700: stdout chunk (state=3): >>><<< 11410 1726773107.03724: done transferring module to remote 11410 1726773107.03735: _low_level_execute_command(): starting 11410 1726773107.03740: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715/ 
/root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715/AnsiballZ_stat.py && sleep 0' 11410 1726773107.06114: stderr chunk (state=2): >>><<< 11410 1726773107.06124: stdout chunk (state=2): >>><<< 11410 1726773107.06140: _low_level_execute_command() done: rc=0, stdout=, stderr= 11410 1726773107.06144: _low_level_execute_command(): starting 11410 1726773107.06151: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715/AnsiballZ_stat.py && sleep 0' 11410 1726773107.21092: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11410 1726773107.22094: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11410 1726773107.22145: stderr chunk (state=3): >>><<< 11410 1726773107.22152: stdout chunk (state=3): >>><<< 11410 1726773107.22167: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 11410 1726773107.22190: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11410 1726773107.22203: _low_level_execute_command(): starting 11410 1726773107.22209: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773106.967994-11410-133076790498715/ > /dev/null 2>&1 && sleep 0' 11410 1726773107.24626: stderr chunk (state=2): >>><<< 11410 1726773107.24637: stdout chunk (state=2): >>><<< 11410 1726773107.24651: _low_level_execute_command() done: rc=0, stdout=, stderr= 11410 1726773107.24658: handler run complete 11410 1726773107.24673: attempt loop complete, returning result 11410 1726773107.24690: variable 'item' from source: unknown 11410 1726773107.24753: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 11410 1726773107.24844: variable 'ansible_host' from source: host vars for 'managed_node2' 11410 1726773107.24854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11410 1726773107.24864: variable 'omit' from source: magic vars 11410 1726773107.24969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11410 1726773107.24993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11410 1726773107.25015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11410 1726773107.25041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11410 1726773107.25053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11410 1726773107.25115: variable 'item' from source: unknown 11410 1726773107.25124: Evaluated conditional (item | length > 0): True 11410 1726773107.25130: variable 'omit' from source: magic vars 11410 1726773107.25141: variable 'omit' from source: magic vars 11410 1726773107.25172: variable 'item' from source: unknown 11410 1726773107.25220: variable 'item' from source: unknown 11410 1726773107.25234: variable 'omit' from source: magic vars 11410 1726773107.25250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11410 1726773107.25259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11410 1726773107.25265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11410 1726773107.25277: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11410 1726773107.25281: variable 'ansible_host' from source: host vars for 'managed_node2' 11410 1726773107.25287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11410 1726773107.25338: Set connection var ansible_pipelining to False 11410 1726773107.25344: Set connection var ansible_timeout to 10 11410 1726773107.25353: Set connection var ansible_module_compression to ZIP_DEFLATED 11410 1726773107.25356: Set connection var ansible_shell_type to sh 11410 1726773107.25361: Set connection var ansible_shell_executable to /bin/sh 11410 1726773107.25365: Set connection var ansible_connection to ssh 11410 1726773107.25379: variable 'ansible_shell_executable' from source: unknown 11410 1726773107.25383: variable 'ansible_connection' from source: unknown 11410 1726773107.25387: variable 'ansible_module_compression' from source: unknown 11410 1726773107.25391: variable 'ansible_shell_type' from source: unknown 11410 1726773107.25394: variable 'ansible_shell_executable' from source: unknown 11410 1726773107.25398: variable 'ansible_host' from source: host vars for 'managed_node2' 11410 1726773107.25404: variable 'ansible_pipelining' from source: unknown 11410 1726773107.25407: variable 'ansible_timeout' from source: unknown 11410 1726773107.25411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11410 1726773107.25474: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11410 1726773107.25484: variable 
'omit' from source: magic vars 11410 1726773107.25491: starting attempt loop 11410 1726773107.25495: running the handler 11410 1726773107.25504: _low_level_execute_command(): starting 11410 1726773107.25508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11410 1726773107.27662: stdout chunk (state=2): >>>/root <<< 11410 1726773107.27781: stderr chunk (state=3): >>><<< 11410 1726773107.27789: stdout chunk (state=3): >>><<< 11410 1726773107.27804: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11410 1726773107.27813: _low_level_execute_command(): starting 11410 1726773107.27818: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118 `" && echo ansible-tmp-1726773107.2780986-11410-43658754641118="` echo /root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118 `" ) && sleep 0' 11410 1726773107.30318: stdout chunk (state=2): >>>ansible-tmp-1726773107.2780986-11410-43658754641118=/root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118 <<< 11410 1726773107.30450: stderr chunk (state=3): >>><<< 11410 1726773107.30457: stdout chunk (state=3): >>><<< 11410 1726773107.30471: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773107.2780986-11410-43658754641118=/root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118 , stderr= 11410 1726773107.30505: variable 'ansible_module_compression' from source: unknown 11410 1726773107.30539: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11410 1726773107.30559: variable 'ansible_facts' from source: unknown 11410 1726773107.30613: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118/AnsiballZ_stat.py 11410 1726773107.30704: Sending initial data 11410 1726773107.30711: Sent initial data (151 bytes) 11410 1726773107.33189: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp4vf455xq /root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118/AnsiballZ_stat.py <<< 11410 1726773107.34282: stderr chunk (state=3): >>><<< 11410 1726773107.34292: stdout chunk (state=3): >>><<< 11410 1726773107.34312: done transferring module to remote 11410 1726773107.34322: _low_level_execute_command(): starting 11410 1726773107.34327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118/ /root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118/AnsiballZ_stat.py && sleep 0' 11410 1726773107.36719: stderr chunk (state=2): >>><<< 11410 1726773107.36729: stdout chunk (state=2): >>><<< 11410 1726773107.36744: _low_level_execute_command() done: rc=0, stdout=, stderr= 11410 1726773107.36749: _low_level_execute_command(): starting 11410 1726773107.36754: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118/AnsiballZ_stat.py && sleep 0' 11410 1726773107.52234: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773042.2211215, "mtime": 
1726773040.2991023, "ctime": 1726773040.2991023, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11410 1726773107.53357: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11410 1726773107.53408: stderr chunk (state=3): >>><<< 11410 1726773107.53415: stdout chunk (state=3): >>><<< 11410 1726773107.53431: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 159, "inode": 917919, "dev": 51713, "nlink": 4, "atime": 1726773042.2211215, "mtime": 1726773040.2991023, "ctime": 1726773040.2991023, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.9.64 closed. 
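Taken together, the two executed loop items show that /etc/tuned/profiles does not exist on managed_node2 while /etc/tuned does (a root-owned directory, mode 0755), so /etc/tuned is the only candidate the following "Set tuned profile parent dir" task can select as the tuned profile parent directory on this host.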
11410 1726773107.53718: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11410 1726773107.53728: _low_level_execute_command(): starting 11410 1726773107.53734: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773107.2780986-11410-43658754641118/ > /dev/null 2>&1 && sleep 0' 11410 1726773107.56156: stderr chunk (state=2): >>><<< 11410 1726773107.56165: stdout chunk (state=2): >>><<< 11410 1726773107.56180: _low_level_execute_command() done: rc=0, stdout=, stderr= 11410 1726773107.56187: handler run complete 11410 1726773107.56220: attempt loop complete, returning result 11410 1726773107.56236: variable 'item' from source: unknown 11410 1726773107.56298: variable 'item' from source: unknown ok: [managed_node2] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726773042.2211215, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1726773040.2991023, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", "mode": "0755", "mtime": 1726773040.2991023, "nlink": 4, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 159, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11410 1726773107.56343: dumping result to json 11410 1726773107.56353: done dumping result, returning 11410 1726773107.56360: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-885f-bbcf-000000000eda] 11410 1726773107.56366: sending task result for task 0affffe7-6841-885f-bbcf-000000000eda 11410 1726773107.56408: done sending task result for task 0affffe7-6841-885f-bbcf-000000000eda 11410 1726773107.56412: WORKER PROCESS EXITING 8240 1726773107.56678: no more pending results, returning what we have 8240 1726773107.56682: results queue empty 8240 1726773107.56683: checking for any_errors_fatal 8240 1726773107.56690: done checking for any_errors_fatal 8240 1726773107.56691: checking for max_fail_percentage 8240 1726773107.56692: done checking for max_fail_percentage 8240 1726773107.56693: checking to see if all hosts have failed and the running result is not ok 8240 1726773107.56694: done checking to see if all hosts have failed 8240 1726773107.56695: getting the remaining hosts for this loop 8240 1726773107.56696: done getting the remaining hosts for this loop 8240 1726773107.56699: getting the next task for host managed_node2 8240 1726773107.56706: done getting next task for host managed_node2 8240 
1726773107.56709: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8240 1726773107.56712: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773107.56719: getting variables 8240 1726773107.56721: in VariableManager get_vars() 8240 1726773107.56742: Calling all_inventory to load vars for managed_node2 8240 1726773107.56744: Calling groups_inventory to load vars for managed_node2 8240 1726773107.56745: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773107.56752: Calling all_plugins_play to load vars for managed_node2 8240 1726773107.56754: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773107.56756: Calling groups_plugins_play to load vars for managed_node2 8240 1726773107.56856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773107.56972: done with get_vars() 8240 1726773107.56979: done getting variables 8240 1726773107.57025: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:11:47 -0400 (0:00:00.670) 0:01:26.214 **** 8240 1726773107.57048: entering _queue_task() for managed_node2/set_fact 8240 1726773107.57219: worker is 1 (out of 1 available) 8240 1726773107.57233: exiting _queue_task() for managed_node2/set_fact 8240 1726773107.57247: done queuing things up, now waiting for results queue to drain 8240 1726773107.57249: waiting for pending results... 
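The set_fact task being queued here consumes the registered stat results (__kernel_settings_find_profile_dirs) to record which candidate directory actually exists. The log does not show the fact name or the expression the role uses, so the sketch below is purely illustrative: __kernel_settings_profile_parent is a hypothetical name, and the filter chain is one way to pick the first existing path from the loop results, not necessarily the role's own implementation.

# Sketch only: fact name and expression are assumptions; given the results
# above this would resolve to /etc/tuned.
- name: Set tuned profile parent dir
  ansible.builtin.set_fact:
    __kernel_settings_profile_parent: >-
      {{ __kernel_settings_find_profile_dirs.results
         | selectattr('stat', 'defined')
         | selectattr('stat.exists')
         | map(attribute='item')
         | first }}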
11428 1726773107.57378: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 11428 1726773107.57497: in run() - task 0affffe7-6841-885f-bbcf-000000000edb 11428 1726773107.57514: variable 'ansible_search_path' from source: unknown 11428 1726773107.57518: variable 'ansible_search_path' from source: unknown 11428 1726773107.57546: calling self._execute() 11428 1726773107.57619: variable 'ansible_host' from source: host vars for 'managed_node2' 11428 1726773107.57627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11428 1726773107.57636: variable 'omit' from source: magic vars 11428 1726773107.57718: variable 'omit' from source: magic vars 11428 1726773107.57758: variable 'omit' from source: magic vars 11428 1726773107.58082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11428 1726773107.59611: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11428 1726773107.59658: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11428 1726773107.59689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11428 1726773107.59957: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11428 1726773107.59980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11428 1726773107.60036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11428 1726773107.60058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11428 1726773107.60077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11428 1726773107.60107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11428 1726773107.60119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11428 1726773107.60151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11428 1726773107.60168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11428 1726773107.60189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11428 1726773107.60215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11428 1726773107.60233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11428 1726773107.60271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11428 1726773107.60291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11428 1726773107.60311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11428 1726773107.60336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11428 1726773107.60347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11428 1726773107.60492: variable '__kernel_settings_find_profile_dirs' from source: set_fact 11428 1726773107.60554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11428 1726773107.60662: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11428 1726773107.60692: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11428 1726773107.60716: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11428 1726773107.60739: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11428 1726773107.60768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11428 1726773107.60787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11428 1726773107.60806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11428 1726773107.60824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11428 1726773107.60861: variable 'omit' from source: magic vars 11428 1726773107.60882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11428 1726773107.60905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11428 1726773107.60921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11428 1726773107.60935: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11428 1726773107.60945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11428 1726773107.60968: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11428 1726773107.60973: variable 'ansible_host' from source: host vars for 'managed_node2' 11428 1726773107.60978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11428 1726773107.61044: Set connection var ansible_pipelining to False 11428 1726773107.61051: Set connection var ansible_timeout to 10 11428 1726773107.61059: Set connection var ansible_module_compression to ZIP_DEFLATED 11428 1726773107.61063: Set connection var ansible_shell_type to sh 11428 1726773107.61068: Set connection var ansible_shell_executable to /bin/sh 11428 1726773107.61073: Set connection var ansible_connection to ssh 11428 1726773107.61091: variable 'ansible_shell_executable' from source: unknown 11428 1726773107.61095: variable 'ansible_connection' from source: unknown 11428 1726773107.61098: variable 'ansible_module_compression' from source: unknown 11428 1726773107.61102: variable 'ansible_shell_type' from source: unknown 11428 1726773107.61105: variable 'ansible_shell_executable' from source: unknown 11428 1726773107.61108: variable 'ansible_host' from source: host vars for 'managed_node2' 11428 1726773107.61112: variable 'ansible_pipelining' from source: unknown 11428 1726773107.61115: variable 'ansible_timeout' from source: unknown 11428 1726773107.61120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11428 1726773107.61180: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11428 1726773107.61193: variable 'omit' from source: magic vars 11428 1726773107.61198: starting attempt loop 11428 1726773107.61202: running the handler 11428 1726773107.61212: handler run complete 11428 1726773107.61221: attempt loop complete, returning result 11428 1726773107.61224: _execute() done 11428 1726773107.61226: dumping result to json 11428 1726773107.61230: done dumping result, returning 11428 1726773107.61236: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-885f-bbcf-000000000edb] 11428 1726773107.61242: sending task result for task 0affffe7-6841-885f-bbcf-000000000edb 11428 1726773107.61260: done sending task result for task 0affffe7-6841-885f-bbcf-000000000edb 11428 1726773107.61263: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8240 1726773107.61392: no more pending results, returning what we have 8240 1726773107.61396: results queue empty 8240 1726773107.61397: checking for any_errors_fatal 8240 1726773107.61412: done checking for any_errors_fatal 8240 1726773107.61413: checking for max_fail_percentage 8240 1726773107.61414: done checking for max_fail_percentage 8240 1726773107.61415: checking to see if all hosts have failed and the running result is not ok 8240 1726773107.61416: done checking to see if all hosts have failed 8240 1726773107.61417: getting the 
remaining hosts for this loop 8240 1726773107.61418: done getting the remaining hosts for this loop 8240 1726773107.61421: getting the next task for host managed_node2 8240 1726773107.61427: done getting next task for host managed_node2 8240 1726773107.61430: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8240 1726773107.61433: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773107.61444: getting variables 8240 1726773107.61445: in VariableManager get_vars() 8240 1726773107.61479: Calling all_inventory to load vars for managed_node2 8240 1726773107.61482: Calling groups_inventory to load vars for managed_node2 8240 1726773107.61483: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773107.61495: Calling all_plugins_play to load vars for managed_node2 8240 1726773107.61498: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773107.61500: Calling groups_plugins_play to load vars for managed_node2 8240 1726773107.61620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773107.61767: done with get_vars() 8240 1726773107.61774: done getting variables 8240 1726773107.61819: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:11:47 -0400 (0:00:00.047) 0:01:26.262 **** 8240 1726773107.61844: entering _queue_task() for managed_node2/service 8240 1726773107.62011: worker is 1 (out of 1 available) 8240 1726773107.62024: exiting _queue_task() for managed_node2/service 8240 1726773107.62039: done queuing things up, now waiting for results queue to drain 8240 1726773107.62041: waiting for pending results... 
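The "Ensure required services are enabled and started" task queued above is invoked with the arguments shown in the worker output that follows (name=tuned, state=started, enabled=true, looping over __kernel_settings_services from include_vars); on this systemd host the generic service action delegates to the ansible.legacy.systemd module. A minimal sketch of such a task, reconstructed from those arguments rather than taken from the role source:

    # Sketch only; __kernel_settings_services is loaded via include_vars per the log
    # and is assumed here to contain at least "tuned".
    - name: Ensure required services are enabled and started
      service:
        name: "{{ item }}"
        state: started
        enabled: true
      loop: "{{ __kernel_settings_services }}"

Because tuned is already active and enabled on managed_node2, the task below reports ok with changed=false.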
11429 1726773107.62169: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 11429 1726773107.62300: in run() - task 0affffe7-6841-885f-bbcf-000000000edc 11429 1726773107.62316: variable 'ansible_search_path' from source: unknown 11429 1726773107.62320: variable 'ansible_search_path' from source: unknown 11429 1726773107.62354: variable '__kernel_settings_services' from source: include_vars 11429 1726773107.62594: variable '__kernel_settings_services' from source: include_vars 11429 1726773107.62655: variable 'omit' from source: magic vars 11429 1726773107.62748: variable 'ansible_host' from source: host vars for 'managed_node2' 11429 1726773107.62759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11429 1726773107.62767: variable 'omit' from source: magic vars 11429 1726773107.62826: variable 'omit' from source: magic vars 11429 1726773107.62862: variable 'omit' from source: magic vars 11429 1726773107.62895: variable 'item' from source: unknown 11429 1726773107.62954: variable 'item' from source: unknown 11429 1726773107.62974: variable 'omit' from source: magic vars 11429 1726773107.63007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11429 1726773107.63034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11429 1726773107.63053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11429 1726773107.63067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11429 1726773107.63077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11429 1726773107.63103: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11429 1726773107.63108: variable 'ansible_host' from source: host vars for 'managed_node2' 11429 1726773107.63113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11429 1726773107.63180: Set connection var ansible_pipelining to False 11429 1726773107.63189: Set connection var ansible_timeout to 10 11429 1726773107.63197: Set connection var ansible_module_compression to ZIP_DEFLATED 11429 1726773107.63200: Set connection var ansible_shell_type to sh 11429 1726773107.63206: Set connection var ansible_shell_executable to /bin/sh 11429 1726773107.63211: Set connection var ansible_connection to ssh 11429 1726773107.63225: variable 'ansible_shell_executable' from source: unknown 11429 1726773107.63228: variable 'ansible_connection' from source: unknown 11429 1726773107.63233: variable 'ansible_module_compression' from source: unknown 11429 1726773107.63236: variable 'ansible_shell_type' from source: unknown 11429 1726773107.63240: variable 'ansible_shell_executable' from source: unknown 11429 1726773107.63242: variable 'ansible_host' from source: host vars for 'managed_node2' 11429 1726773107.63244: variable 'ansible_pipelining' from source: unknown 11429 1726773107.63246: variable 'ansible_timeout' from source: unknown 11429 1726773107.63248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11429 1726773107.63340: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11429 1726773107.63353: variable 'omit' from source: magic vars 11429 1726773107.63360: starting attempt loop 11429 1726773107.63363: running the handler 11429 1726773107.63428: variable 'ansible_facts' from source: unknown 11429 1726773107.63519: _low_level_execute_command(): starting 11429 1726773107.63529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11429 1726773107.65888: stdout chunk (state=2): >>>/root <<< 11429 1726773107.66013: stderr chunk (state=3): >>><<< 11429 1726773107.66021: stdout chunk (state=3): >>><<< 11429 1726773107.66041: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11429 1726773107.66055: _low_level_execute_command(): starting 11429 1726773107.66062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368 `" && echo ansible-tmp-1726773107.6605027-11429-61023336337368="` echo /root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368 `" ) && sleep 0' 11429 1726773107.68661: stdout chunk (state=2): >>>ansible-tmp-1726773107.6605027-11429-61023336337368=/root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368 <<< 11429 1726773107.68783: stderr chunk (state=3): >>><<< 11429 1726773107.68791: stdout chunk (state=3): >>><<< 11429 1726773107.68808: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773107.6605027-11429-61023336337368=/root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368 , stderr= 11429 1726773107.68835: variable 'ansible_module_compression' from source: unknown 11429 1726773107.68877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11429 1726773107.68931: variable 'ansible_facts' from source: unknown 11429 1726773107.69088: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368/AnsiballZ_systemd.py 11429 1726773107.69195: Sending initial data 11429 1726773107.69205: Sent initial data (154 bytes) 11429 1726773107.71672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpiqpcq90b /root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368/AnsiballZ_systemd.py <<< 11429 1726773107.73650: stderr chunk (state=3): >>><<< 11429 1726773107.73659: stdout chunk (state=3): >>><<< 11429 1726773107.73681: done transferring module to remote 11429 1726773107.73694: _low_level_execute_command(): starting 11429 1726773107.73700: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368/ /root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368/AnsiballZ_systemd.py && sleep 0' 11429 1726773107.76089: stderr chunk (state=2): >>><<< 11429 1726773107.76098: stdout chunk (state=2): >>><<< 11429 1726773107.76114: _low_level_execute_command() done: rc=0, stdout=, stderr= 11429 1726773107.76119: _low_level_execute_command(): starting 11429 1726773107.76124: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368/AnsiballZ_systemd.py && sleep 0' 11429 1726773108.03860: stdout chunk (state=2): >>> {"name": "tuned", 
"changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "21049344", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "<<< 11429 1726773108.03904: stdout chunk (state=3): >>>infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChange<<< 11429 1726773108.03914: stdout chunk (state=3): >>>TimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", 
"AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11429 1726773108.05499: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11429 1726773108.05547: stderr chunk (state=3): >>><<< 11429 1726773108.05554: stdout chunk (state=3): >>><<< 11429 1726773108.05574: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "21049344", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", 
"LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", 
"InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11429 1726773108.05702: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11429 1726773108.05722: _low_level_execute_command(): starting 11429 1726773108.05728: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773107.6605027-11429-61023336337368/ > /dev/null 2>&1 && sleep 0' 11429 1726773108.08138: stderr chunk (state=2): >>><<< 11429 1726773108.08147: stdout chunk (state=2): >>><<< 11429 1726773108.08162: _low_level_execute_command() done: rc=0, stdout=, stderr= 11429 1726773108.08169: handler run complete 11429 1726773108.08206: attempt loop complete, returning result 11429 1726773108.08223: variable 'item' from source: unknown 11429 1726773108.08282: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Before": "shutdown.target 
multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "671", "MemoryAccounting": "yes", "MemoryCurrent": "21049344", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "WatchdogUSec": "0" } } 11429 1726773108.08387: dumping result to json 11429 1726773108.08405: done dumping result, returning 11429 1726773108.08414: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-885f-bbcf-000000000edc] 11429 1726773108.08420: sending task 
result for task 0affffe7-6841-885f-bbcf-000000000edc 11429 1726773108.08525: done sending task result for task 0affffe7-6841-885f-bbcf-000000000edc 11429 1726773108.08529: WORKER PROCESS EXITING 8240 1726773108.08872: no more pending results, returning what we have 8240 1726773108.08875: results queue empty 8240 1726773108.08875: checking for any_errors_fatal 8240 1726773108.08879: done checking for any_errors_fatal 8240 1726773108.08880: checking for max_fail_percentage 8240 1726773108.08880: done checking for max_fail_percentage 8240 1726773108.08881: checking to see if all hosts have failed and the running result is not ok 8240 1726773108.08882: done checking to see if all hosts have failed 8240 1726773108.08882: getting the remaining hosts for this loop 8240 1726773108.08883: done getting the remaining hosts for this loop 8240 1726773108.08887: getting the next task for host managed_node2 8240 1726773108.08892: done getting next task for host managed_node2 8240 1726773108.08894: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8240 1726773108.08897: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773108.08906: getting variables 8240 1726773108.08907: in VariableManager get_vars() 8240 1726773108.08930: Calling all_inventory to load vars for managed_node2 8240 1726773108.08932: Calling groups_inventory to load vars for managed_node2 8240 1726773108.08933: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773108.08940: Calling all_plugins_play to load vars for managed_node2 8240 1726773108.08942: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773108.08944: Calling groups_plugins_play to load vars for managed_node2 8240 1726773108.09058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773108.09175: done with get_vars() 8240 1726773108.09183: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:11:48 -0400 (0:00:00.474) 0:01:26.736 **** 8240 1726773108.09252: entering _queue_task() for managed_node2/file 8240 1726773108.09418: worker is 1 (out of 1 available) 8240 1726773108.09433: exiting _queue_task() for managed_node2/file 8240 1726773108.09446: done queuing things up, now waiting for results queue to drain 8240 1726773108.09448: waiting for pending results... 
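The next task, "Ensure kernel settings profile directory exists", runs the file module with path=/etc/tuned/kernel_settings, state=directory and mode=0755, as the worker output below shows. A hedged sketch of an equivalent task (the composition of __kernel_settings_profile_dir from __kernel_settings_profile_parent and __kernel_settings_tuned_profile is an assumption based on the variable names that appear in the worker trace):

    # Sketch only; __kernel_settings_profile_dir resolves to /etc/tuned/kernel_settings in this run.
    - name: Ensure kernel settings profile directory exists
      file:
        path: "{{ __kernel_settings_profile_dir }}"
        state: directory
        mode: "0755"

The directory already exists with the expected owner, mode and SELinux context, so this too comes back ok with changed=false.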
11437 1726773108.09584: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 11437 1726773108.09717: in run() - task 0affffe7-6841-885f-bbcf-000000000edd 11437 1726773108.09734: variable 'ansible_search_path' from source: unknown 11437 1726773108.09738: variable 'ansible_search_path' from source: unknown 11437 1726773108.09765: calling self._execute() 11437 1726773108.09839: variable 'ansible_host' from source: host vars for 'managed_node2' 11437 1726773108.09847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11437 1726773108.09857: variable 'omit' from source: magic vars 11437 1726773108.09936: variable 'omit' from source: magic vars 11437 1726773108.09976: variable 'omit' from source: magic vars 11437 1726773108.10000: variable '__kernel_settings_profile_dir' from source: role '' all vars 11437 1726773108.10230: variable '__kernel_settings_profile_dir' from source: role '' all vars 11437 1726773108.10300: variable '__kernel_settings_profile_parent' from source: set_fact 11437 1726773108.10311: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11437 1726773108.10348: variable 'omit' from source: magic vars 11437 1726773108.10381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11437 1726773108.10410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11437 1726773108.10429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11437 1726773108.10444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11437 1726773108.10455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11437 1726773108.10478: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11437 1726773108.10483: variable 'ansible_host' from source: host vars for 'managed_node2' 11437 1726773108.10490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11437 1726773108.10561: Set connection var ansible_pipelining to False 11437 1726773108.10569: Set connection var ansible_timeout to 10 11437 1726773108.10576: Set connection var ansible_module_compression to ZIP_DEFLATED 11437 1726773108.10579: Set connection var ansible_shell_type to sh 11437 1726773108.10586: Set connection var ansible_shell_executable to /bin/sh 11437 1726773108.10591: Set connection var ansible_connection to ssh 11437 1726773108.10610: variable 'ansible_shell_executable' from source: unknown 11437 1726773108.10614: variable 'ansible_connection' from source: unknown 11437 1726773108.10618: variable 'ansible_module_compression' from source: unknown 11437 1726773108.10621: variable 'ansible_shell_type' from source: unknown 11437 1726773108.10624: variable 'ansible_shell_executable' from source: unknown 11437 1726773108.10628: variable 'ansible_host' from source: host vars for 'managed_node2' 11437 1726773108.10632: variable 'ansible_pipelining' from source: unknown 11437 1726773108.10635: variable 'ansible_timeout' from source: unknown 11437 1726773108.10639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11437 1726773108.10777: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched 
paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11437 1726773108.10790: variable 'omit' from source: magic vars 11437 1726773108.10796: starting attempt loop 11437 1726773108.10800: running the handler 11437 1726773108.10814: _low_level_execute_command(): starting 11437 1726773108.10822: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11437 1726773108.13141: stdout chunk (state=2): >>>/root <<< 11437 1726773108.13264: stderr chunk (state=3): >>><<< 11437 1726773108.13271: stdout chunk (state=3): >>><<< 11437 1726773108.13292: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11437 1726773108.13305: _low_level_execute_command(): starting 11437 1726773108.13312: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230 `" && echo ansible-tmp-1726773108.1329997-11437-4799254640230="` echo /root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230 `" ) && sleep 0' 11437 1726773108.15882: stdout chunk (state=2): >>>ansible-tmp-1726773108.1329997-11437-4799254640230=/root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230 <<< 11437 1726773108.16019: stderr chunk (state=3): >>><<< 11437 1726773108.16027: stdout chunk (state=3): >>><<< 11437 1726773108.16042: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773108.1329997-11437-4799254640230=/root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230 , stderr= 11437 1726773108.16078: variable 'ansible_module_compression' from source: unknown 11437 1726773108.16128: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11437 1726773108.16165: variable 'ansible_facts' from source: unknown 11437 1726773108.16230: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230/AnsiballZ_file.py 11437 1726773108.16334: Sending initial data 11437 1726773108.16341: Sent initial data (150 bytes) 11437 1726773108.18849: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpirr5wwuy /root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230/AnsiballZ_file.py <<< 11437 1726773108.19976: stderr chunk (state=3): >>><<< 11437 1726773108.19984: stdout chunk (state=3): >>><<< 11437 1726773108.20006: done transferring module to remote 11437 1726773108.20017: _low_level_execute_command(): starting 11437 1726773108.20022: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230/ /root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230/AnsiballZ_file.py && sleep 0' 11437 1726773108.22375: stderr chunk (state=2): >>><<< 11437 1726773108.22384: stdout chunk (state=2): >>><<< 11437 1726773108.22400: _low_level_execute_command() done: rc=0, stdout=, stderr= 11437 1726773108.22406: _low_level_execute_command(): starting 11437 1726773108.22411: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230/AnsiballZ_file.py && sleep 0' 11437 1726773108.38659: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": 
"/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11437 1726773108.39780: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11437 1726773108.39832: stderr chunk (state=3): >>><<< 11437 1726773108.39839: stdout chunk (state=3): >>><<< 11437 1726773108.39856: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": false, "diff": {"before": {"path": "/etc/tuned/kernel_settings"}, "after": {"path": "/etc/tuned/kernel_settings"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11437 1726773108.39890: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11437 1726773108.39901: _low_level_execute_command(): starting 11437 1726773108.39907: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773108.1329997-11437-4799254640230/ > /dev/null 2>&1 && sleep 0' 11437 1726773108.42310: stderr chunk (state=2): >>><<< 11437 1726773108.42321: stdout chunk (state=2): >>><<< 11437 1726773108.42336: _low_level_execute_command() done: rc=0, stdout=, stderr= 11437 1726773108.42343: handler run complete 11437 1726773108.42362: attempt loop complete, returning result 11437 1726773108.42365: _execute() done 11437 1726773108.42368: dumping result to json 11437 1726773108.42374: done dumping result, returning 11437 1726773108.42382: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-885f-bbcf-000000000edd] 11437 1726773108.42389: sending task result for task 0affffe7-6841-885f-bbcf-000000000edd 11437 1726773108.42424: done sending task result for task 0affffe7-6841-885f-bbcf-000000000edd 11437 1726773108.42427: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 24, "state": "directory", "uid": 0 } 8240 1726773108.42584: no more pending results, returning what we have 8240 1726773108.42589: results queue empty 8240 1726773108.42590: checking for any_errors_fatal 8240 1726773108.42608: done checking for any_errors_fatal 8240 1726773108.42609: checking for max_fail_percentage 8240 1726773108.42611: done checking for max_fail_percentage 8240 1726773108.42612: checking to see if all hosts have failed and the running result is not ok 8240 1726773108.42613: done checking to see if all hosts have failed 8240 1726773108.42613: getting the remaining hosts for this loop 8240 1726773108.42614: done getting the remaining hosts for this loop 8240 1726773108.42618: getting the next task for host managed_node2 8240 1726773108.42624: done getting next task for host managed_node2 8240 1726773108.42628: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8240 1726773108.42631: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773108.42641: getting variables 8240 1726773108.42643: in VariableManager get_vars() 8240 1726773108.42677: Calling all_inventory to load vars for managed_node2 8240 1726773108.42680: Calling groups_inventory to load vars for managed_node2 8240 1726773108.42682: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773108.42693: Calling all_plugins_play to load vars for managed_node2 8240 1726773108.42695: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773108.42697: Calling groups_plugins_play to load vars for managed_node2 8240 1726773108.42805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773108.42924: done with get_vars() 8240 1726773108.42932: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:11:48 -0400 (0:00:00.337) 0:01:27.073 **** 8240 1726773108.43004: entering _queue_task() for managed_node2/slurp 8240 1726773108.43165: worker is 1 (out of 1 available) 8240 1726773108.43179: exiting _queue_task() for managed_node2/slurp 8240 1726773108.43193: done queuing things up, now waiting for results queue to drain 8240 1726773108.43195: waiting for pending results... 
11448 1726773108.43329: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 11448 1726773108.43460: in run() - task 0affffe7-6841-885f-bbcf-000000000ede 11448 1726773108.43479: variable 'ansible_search_path' from source: unknown 11448 1726773108.43483: variable 'ansible_search_path' from source: unknown 11448 1726773108.43517: calling self._execute() 11448 1726773108.43587: variable 'ansible_host' from source: host vars for 'managed_node2' 11448 1726773108.43596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11448 1726773108.43607: variable 'omit' from source: magic vars 11448 1726773108.43682: variable 'omit' from source: magic vars 11448 1726773108.43722: variable 'omit' from source: magic vars 11448 1726773108.43745: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11448 1726773108.43964: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11448 1726773108.44027: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11448 1726773108.44058: variable 'omit' from source: magic vars 11448 1726773108.44094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11448 1726773108.44122: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11448 1726773108.44141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11448 1726773108.44155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11448 1726773108.44166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11448 1726773108.44193: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11448 1726773108.44199: variable 'ansible_host' from source: host vars for 'managed_node2' 11448 1726773108.44206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11448 1726773108.44272: Set connection var ansible_pipelining to False 11448 1726773108.44279: Set connection var ansible_timeout to 10 11448 1726773108.44289: Set connection var ansible_module_compression to ZIP_DEFLATED 11448 1726773108.44293: Set connection var ansible_shell_type to sh 11448 1726773108.44298: Set connection var ansible_shell_executable to /bin/sh 11448 1726773108.44305: Set connection var ansible_connection to ssh 11448 1726773108.44321: variable 'ansible_shell_executable' from source: unknown 11448 1726773108.44326: variable 'ansible_connection' from source: unknown 11448 1726773108.44330: variable 'ansible_module_compression' from source: unknown 11448 1726773108.44333: variable 'ansible_shell_type' from source: unknown 11448 1726773108.44336: variable 'ansible_shell_executable' from source: unknown 11448 1726773108.44340: variable 'ansible_host' from source: host vars for 'managed_node2' 11448 1726773108.44344: variable 'ansible_pipelining' from source: unknown 11448 1726773108.44347: variable 'ansible_timeout' from source: unknown 11448 1726773108.44351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11448 1726773108.44491: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11448 1726773108.44505: variable 'omit' from source: magic vars 11448 1726773108.44512: starting attempt loop 11448 1726773108.44516: running the handler 11448 1726773108.44527: _low_level_execute_command(): starting 11448 1726773108.44535: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11448 1726773108.46846: stdout chunk (state=2): >>>/root <<< 11448 1726773108.46964: stderr chunk (state=3): >>><<< 11448 1726773108.46972: stdout chunk (state=3): >>><<< 11448 1726773108.46993: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11448 1726773108.47010: _low_level_execute_command(): starting 11448 1726773108.47017: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351 `" && echo ansible-tmp-1726773108.4700446-11448-169872653982351="` echo /root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351 `" ) && sleep 0' 11448 1726773108.49738: stdout chunk (state=2): >>>ansible-tmp-1726773108.4700446-11448-169872653982351=/root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351 <<< 11448 1726773108.49864: stderr chunk (state=3): >>><<< 11448 1726773108.49871: stdout chunk (state=3): >>><<< 11448 1726773108.49889: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773108.4700446-11448-169872653982351=/root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351 , stderr= 11448 1726773108.49928: variable 'ansible_module_compression' from source: unknown 11448 1726773108.49962: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 11448 1726773108.49997: variable 'ansible_facts' from source: unknown 11448 1726773108.50057: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351/AnsiballZ_slurp.py 11448 1726773108.50595: Sending initial data 11448 1726773108.50606: Sent initial data (153 bytes) 11448 1726773108.52770: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpw0e4oud_ /root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351/AnsiballZ_slurp.py <<< 11448 1726773108.53860: stderr chunk (state=3): >>><<< 11448 1726773108.53869: stdout chunk (state=3): >>><<< 11448 1726773108.53892: done transferring module to remote 11448 1726773108.53904: _low_level_execute_command(): starting 11448 1726773108.53910: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351/ /root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351/AnsiballZ_slurp.py && sleep 0' 11448 1726773108.56277: stderr chunk (state=2): >>><<< 11448 1726773108.56289: stdout chunk (state=2): >>><<< 11448 1726773108.56305: _low_level_execute_command() done: rc=0, stdout=, stderr= 11448 1726773108.56309: _low_level_execute_command(): starting 11448 1726773108.56314: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351/AnsiballZ_slurp.py && sleep 0' 11448 1726773108.71062: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", 
"invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11448 1726773108.72031: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11448 1726773108.72078: stderr chunk (state=3): >>><<< 11448 1726773108.72084: stdout chunk (state=3): >>><<< 11448 1726773108.72102: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.9.64 closed. 11448 1726773108.72126: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11448 1726773108.72139: _low_level_execute_command(): starting 11448 1726773108.72145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773108.4700446-11448-169872653982351/ > /dev/null 2>&1 && sleep 0' 11448 1726773108.74563: stderr chunk (state=2): >>><<< 11448 1726773108.74573: stdout chunk (state=2): >>><<< 11448 1726773108.74590: _low_level_execute_command() done: rc=0, stdout=, stderr= 11448 1726773108.74598: handler run complete 11448 1726773108.74614: attempt loop complete, returning result 11448 1726773108.74618: _execute() done 11448 1726773108.74621: dumping result to json 11448 1726773108.74626: done dumping result, returning 11448 1726773108.74633: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-885f-bbcf-000000000ede] 11448 1726773108.74639: sending task result for task 0affffe7-6841-885f-bbcf-000000000ede 11448 1726773108.74666: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ede 11448 1726773108.74670: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8240 1726773108.74808: no more pending results, returning what we have 8240 1726773108.74812: results queue empty 8240 1726773108.74813: checking for any_errors_fatal 8240 1726773108.74825: done checking for any_errors_fatal 8240 1726773108.74825: checking for max_fail_percentage 8240 1726773108.74826: done checking for max_fail_percentage 8240 1726773108.74827: checking to see if all hosts have failed and the running result is not ok 8240 1726773108.74828: done checking to see if all hosts have failed 8240 1726773108.74829: getting the remaining hosts for this loop 8240 1726773108.74830: done getting the remaining hosts for this loop 8240 1726773108.74833: getting the next task for host managed_node2 8240 1726773108.74841: done getting next task for host managed_node2 8240 1726773108.74844: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8240 1726773108.74847: ^ state is: HOST STATE: 
block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773108.74858: getting variables 8240 1726773108.74859: in VariableManager get_vars() 8240 1726773108.74896: Calling all_inventory to load vars for managed_node2 8240 1726773108.74899: Calling groups_inventory to load vars for managed_node2 8240 1726773108.74901: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773108.74911: Calling all_plugins_play to load vars for managed_node2 8240 1726773108.74914: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773108.74916: Calling groups_plugins_play to load vars for managed_node2 8240 1726773108.75031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773108.75153: done with get_vars() 8240 1726773108.75162: done getting variables 8240 1726773108.75210: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:11:48 -0400 (0:00:00.322) 0:01:27.396 **** 8240 1726773108.75235: entering _queue_task() for managed_node2/set_fact 8240 1726773108.75401: worker is 1 (out of 1 available) 8240 1726773108.75415: exiting _queue_task() for managed_node2/set_fact 8240 1726773108.75429: done queuing things up, now waiting for results queue to drain 8240 1726773108.75431: waiting for pending results... 
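Note: the Get active_profile result above returns the file content base64-encoded; dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK decodes to "virtual-guest kernel_settings" followed by a newline. A hedged sketch of an equivalent task, reconstructed from the logged slurp module_args (the register name is an assumption taken from the variable name that appears later in the log):

    # Sketch only; reconstructed from the logged slurp module_args.
    - name: Get active_profile
      ansible.builtin.slurp:
        path: /etc/tuned/active_profile
      register: __kernel_settings_tuned_current_profile  # assumed register name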
11456 1726773108.75562: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 11456 1726773108.75689: in run() - task 0affffe7-6841-885f-bbcf-000000000edf 11456 1726773108.75712: variable 'ansible_search_path' from source: unknown 11456 1726773108.75716: variable 'ansible_search_path' from source: unknown 11456 1726773108.75744: calling self._execute() 11456 1726773108.76166: variable 'ansible_host' from source: host vars for 'managed_node2' 11456 1726773108.76176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11456 1726773108.76186: variable 'omit' from source: magic vars 11456 1726773108.76260: variable 'omit' from source: magic vars 11456 1726773108.76305: variable 'omit' from source: magic vars 11456 1726773108.76587: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11456 1726773108.76597: variable '__cur_profile' from source: task vars 11456 1726773108.76705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11456 1726773108.78220: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11456 1726773108.78281: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11456 1726773108.78315: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11456 1726773108.78341: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11456 1726773108.78362: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11456 1726773108.78421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11456 1726773108.78443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11456 1726773108.78461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11456 1726773108.78493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11456 1726773108.78507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11456 1726773108.78589: variable '__kernel_settings_tuned_current_profile' from source: set_fact 11456 1726773108.78637: variable 'omit' from source: magic vars 11456 1726773108.78658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11456 1726773108.78679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11456 1726773108.78698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11456 1726773108.78714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11456 1726773108.78724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11456 1726773108.78747: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11456 1726773108.78753: variable 'ansible_host' from source: host vars for 'managed_node2' 11456 1726773108.78758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11456 1726773108.78828: Set connection var ansible_pipelining to False 11456 1726773108.78835: Set connection var ansible_timeout to 10 11456 1726773108.78843: Set connection var ansible_module_compression to ZIP_DEFLATED 11456 1726773108.78846: Set connection var ansible_shell_type to sh 11456 1726773108.78851: Set connection var ansible_shell_executable to /bin/sh 11456 1726773108.78856: Set connection var ansible_connection to ssh 11456 1726773108.78872: variable 'ansible_shell_executable' from source: unknown 11456 1726773108.78876: variable 'ansible_connection' from source: unknown 11456 1726773108.78879: variable 'ansible_module_compression' from source: unknown 11456 1726773108.78882: variable 'ansible_shell_type' from source: unknown 11456 1726773108.78887: variable 'ansible_shell_executable' from source: unknown 11456 1726773108.78891: variable 'ansible_host' from source: host vars for 'managed_node2' 11456 1726773108.78895: variable 'ansible_pipelining' from source: unknown 11456 1726773108.78898: variable 'ansible_timeout' from source: unknown 11456 1726773108.78906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11456 1726773108.78966: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11456 1726773108.78977: variable 'omit' from source: magic vars 11456 1726773108.78982: starting attempt loop 11456 1726773108.78987: running the handler 11456 1726773108.78997: handler run complete 11456 1726773108.79007: attempt loop complete, returning result 11456 1726773108.79011: _execute() done 11456 1726773108.79014: dumping result to json 11456 1726773108.79018: done dumping result, returning 11456 1726773108.79025: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-885f-bbcf-000000000edf] 11456 1726773108.79030: sending task result for task 0affffe7-6841-885f-bbcf-000000000edf 11456 1726773108.79049: done sending task result for task 0affffe7-6841-885f-bbcf-000000000edf 11456 1726773108.79053: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8240 1726773108.79487: no more pending results, returning what we have 8240 1726773108.79489: results queue empty 8240 1726773108.79490: checking for any_errors_fatal 8240 1726773108.79493: done checking for any_errors_fatal 8240 1726773108.79493: checking for max_fail_percentage 8240 1726773108.79494: done checking for max_fail_percentage 8240 1726773108.79494: checking to see if all hosts have failed and the running result is not ok 8240 1726773108.79495: done checking to see if all hosts have failed 8240 1726773108.79496: getting the remaining hosts for this loop 8240 1726773108.79496: done getting the remaining hosts for 
this loop 8240 1726773108.79499: getting the next task for host managed_node2 8240 1726773108.79504: done getting next task for host managed_node2 8240 1726773108.79506: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8240 1726773108.79510: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773108.79521: getting variables 8240 1726773108.79522: in VariableManager get_vars() 8240 1726773108.79542: Calling all_inventory to load vars for managed_node2 8240 1726773108.79544: Calling groups_inventory to load vars for managed_node2 8240 1726773108.79545: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773108.79552: Calling all_plugins_play to load vars for managed_node2 8240 1726773108.79554: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773108.79555: Calling groups_plugins_play to load vars for managed_node2 8240 1726773108.79648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773108.79760: done with get_vars() 8240 1726773108.79766: done getting variables 8240 1726773108.79809: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:11:48 -0400 (0:00:00.045) 0:01:27.442 **** 8240 1726773108.79832: entering _queue_task() for managed_node2/copy 8240 1726773108.79995: worker is 1 (out of 1 available) 8240 1726773108.80007: exiting _queue_task() for managed_node2/copy 8240 1726773108.80020: done queuing things up, now waiting for results queue to drain 8240 1726773108.80023: waiting for pending results... 
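Note: the Set active_profile result above defines the fact __kernel_settings_active_profile as "virtual-guest kernel_settings". The actual Jinja expression the role uses is not visible in this log; the sketch below is purely illustrative of a set_fact that would yield that value (keep the decoded current profile when it already lists kernel_settings, otherwise append it):

    # Illustrative only: the role's real expression is not shown in this log.
    - name: Set active_profile
      ansible.builtin.set_fact:
        __kernel_settings_active_profile: >-
          {{ __cur_profile if 'kernel_settings' in __cur_profile.split()
             else __cur_profile ~ ' kernel_settings' }}
      vars:
        __cur_profile: "{{ __kernel_settings_tuned_current_profile.content | b64decode | trim }}"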
11457 1726773108.80156: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 11457 1726773108.80296: in run() - task 0affffe7-6841-885f-bbcf-000000000ee0 11457 1726773108.80313: variable 'ansible_search_path' from source: unknown 11457 1726773108.80317: variable 'ansible_search_path' from source: unknown 11457 1726773108.80345: calling self._execute() 11457 1726773108.80416: variable 'ansible_host' from source: host vars for 'managed_node2' 11457 1726773108.80424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11457 1726773108.80432: variable 'omit' from source: magic vars 11457 1726773108.80514: variable 'omit' from source: magic vars 11457 1726773108.80558: variable 'omit' from source: magic vars 11457 1726773108.80583: variable '__kernel_settings_active_profile' from source: set_fact 11457 1726773108.80815: variable '__kernel_settings_active_profile' from source: set_fact 11457 1726773108.80838: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11457 1726773108.80893: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 11457 1726773108.80944: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11457 1726773108.80965: variable 'omit' from source: magic vars 11457 1726773108.81007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11457 1726773108.81033: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11457 1726773108.81051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11457 1726773108.81065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11457 1726773108.81076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11457 1726773108.81104: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11457 1726773108.81109: variable 'ansible_host' from source: host vars for 'managed_node2' 11457 1726773108.81113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11457 1726773108.81180: Set connection var ansible_pipelining to False 11457 1726773108.81188: Set connection var ansible_timeout to 10 11457 1726773108.81197: Set connection var ansible_module_compression to ZIP_DEFLATED 11457 1726773108.81200: Set connection var ansible_shell_type to sh 11457 1726773108.81209: Set connection var ansible_shell_executable to /bin/sh 11457 1726773108.81214: Set connection var ansible_connection to ssh 11457 1726773108.81231: variable 'ansible_shell_executable' from source: unknown 11457 1726773108.81235: variable 'ansible_connection' from source: unknown 11457 1726773108.81239: variable 'ansible_module_compression' from source: unknown 11457 1726773108.81242: variable 'ansible_shell_type' from source: unknown 11457 1726773108.81246: variable 'ansible_shell_executable' from source: unknown 11457 1726773108.81249: variable 'ansible_host' from source: host vars for 'managed_node2' 11457 1726773108.81255: variable 'ansible_pipelining' from source: unknown 11457 1726773108.81258: variable 'ansible_timeout' from source: unknown 11457 1726773108.81262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11457 1726773108.81352: Loading ActionModule 
'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11457 1726773108.81364: variable 'omit' from source: magic vars 11457 1726773108.81370: starting attempt loop 11457 1726773108.81374: running the handler 11457 1726773108.81384: _low_level_execute_command(): starting 11457 1726773108.81396: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11457 1726773108.83740: stdout chunk (state=2): >>>/root <<< 11457 1726773108.83864: stderr chunk (state=3): >>><<< 11457 1726773108.83872: stdout chunk (state=3): >>><<< 11457 1726773108.83894: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11457 1726773108.83909: _low_level_execute_command(): starting 11457 1726773108.83916: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059 `" && echo ansible-tmp-1726773108.8390255-11457-241179231652059="` echo /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059 `" ) && sleep 0' 11457 1726773108.86480: stdout chunk (state=2): >>>ansible-tmp-1726773108.8390255-11457-241179231652059=/root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059 <<< 11457 1726773108.86620: stderr chunk (state=3): >>><<< 11457 1726773108.86628: stdout chunk (state=3): >>><<< 11457 1726773108.86643: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773108.8390255-11457-241179231652059=/root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059 , stderr= 11457 1726773108.86717: variable 'ansible_module_compression' from source: unknown 11457 1726773108.86767: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11457 1726773108.86804: variable 'ansible_facts' from source: unknown 11457 1726773108.86865: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/AnsiballZ_stat.py 11457 1726773108.86955: Sending initial data 11457 1726773108.86963: Sent initial data (152 bytes) 11457 1726773108.89452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpr_xoh5vr /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/AnsiballZ_stat.py <<< 11457 1726773108.90554: stderr chunk (state=3): >>><<< 11457 1726773108.90564: stdout chunk (state=3): >>><<< 11457 1726773108.90586: done transferring module to remote 11457 1726773108.90598: _low_level_execute_command(): starting 11457 1726773108.90604: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/ /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/AnsiballZ_stat.py && sleep 0' 11457 1726773108.92962: stderr chunk (state=2): >>><<< 11457 1726773108.92972: stdout chunk (state=2): >>><<< 11457 1726773108.92989: _low_level_execute_command() done: rc=0, stdout=, stderr= 11457 1726773108.92993: _low_level_execute_command(): starting 11457 1726773108.92999: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/AnsiballZ_stat.py && sleep 0' 11457 1726773109.09090: stdout chunk 
(state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773108.7082167, "mtime": 1726773100.6301363, "ctime": 1726773100.6301363, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11457 1726773109.10206: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11457 1726773109.10254: stderr chunk (state=3): >>><<< 11457 1726773109.10261: stdout chunk (state=3): >>><<< 11457 1726773109.10278: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773108.7082167, "mtime": 1726773100.6301363, "ctime": 1726773100.6301363, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
11457 1726773109.10324: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11457 1726773109.10360: variable 'ansible_module_compression' from source: unknown 11457 1726773109.10397: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11457 1726773109.10418: variable 'ansible_facts' from source: unknown 11457 1726773109.10473: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/AnsiballZ_file.py 11457 1726773109.10564: Sending initial data 11457 1726773109.10573: Sent initial data (152 bytes) 11457 1726773109.13122: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpwxvs4lw_ /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/AnsiballZ_file.py <<< 11457 1726773109.14263: stderr chunk (state=3): >>><<< 11457 1726773109.14272: stdout chunk (state=3): >>><<< 11457 1726773109.14292: done transferring module to remote 11457 1726773109.14303: _low_level_execute_command(): starting 11457 1726773109.14309: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/ /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/AnsiballZ_file.py && sleep 0' 11457 1726773109.16662: stderr chunk (state=2): >>><<< 11457 1726773109.16672: stdout chunk (state=2): >>><<< 11457 1726773109.16690: _low_level_execute_command() done: rc=0, stdout=, stderr= 11457 1726773109.16695: _low_level_execute_command(): starting 11457 1726773109.16700: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/AnsiballZ_file.py && sleep 0' 11457 1726773109.32507: stdout chunk (state=2): >>> {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpnwdb8u6d", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11457 1726773109.33630: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 11457 1726773109.33680: stderr chunk (state=3): >>><<< 11457 1726773109.33688: stdout chunk (state=3): >>><<< 11457 1726773109.33706: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/active_profile", "changed": false, "diff": {"before": {"path": "/etc/tuned/active_profile"}, "after": {"path": "/etc/tuned/active_profile"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/active_profile", "_original_basename": "tmpnwdb8u6d", "recurse": false, "state": "file", "path": "/etc/tuned/active_profile", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11457 1726773109.33733: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/active_profile', '_original_basename': 'tmpnwdb8u6d', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11457 1726773109.33744: _low_level_execute_command(): starting 11457 1726773109.33750: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773108.8390255-11457-241179231652059/ > /dev/null 2>&1 && sleep 0' 11457 1726773109.36176: stderr chunk (state=2): >>><<< 11457 1726773109.36188: stdout chunk (state=2): >>><<< 11457 1726773109.36205: _low_level_execute_command() done: rc=0, stdout=, stderr= 11457 1726773109.36214: handler run complete 11457 1726773109.36234: attempt loop complete, returning result 11457 1726773109.36238: _execute() done 11457 1726773109.36242: dumping result to json 11457 1726773109.36248: done dumping result, returning 11457 1726773109.36255: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-885f-bbcf-000000000ee0] 11457 1726773109.36261: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee0 11457 1726773109.36295: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee0 11457 1726773109.36298: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/active_profile", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "state": "file", "uid": 0 } 8240 1726773109.36466: no more pending results, returning what we have 8240 1726773109.36469: results queue empty 8240 1726773109.36470: checking for any_errors_fatal 8240 1726773109.36478: done checking for 
any_errors_fatal 8240 1726773109.36478: checking for max_fail_percentage 8240 1726773109.36480: done checking for max_fail_percentage 8240 1726773109.36480: checking to see if all hosts have failed and the running result is not ok 8240 1726773109.36481: done checking to see if all hosts have failed 8240 1726773109.36482: getting the remaining hosts for this loop 8240 1726773109.36483: done getting the remaining hosts for this loop 8240 1726773109.36488: getting the next task for host managed_node2 8240 1726773109.36495: done getting next task for host managed_node2 8240 1726773109.36499: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8240 1726773109.36504: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773109.36515: getting variables 8240 1726773109.36517: in VariableManager get_vars() 8240 1726773109.36551: Calling all_inventory to load vars for managed_node2 8240 1726773109.36553: Calling groups_inventory to load vars for managed_node2 8240 1726773109.36555: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773109.36564: Calling all_plugins_play to load vars for managed_node2 8240 1726773109.36566: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773109.36568: Calling groups_plugins_play to load vars for managed_node2 8240 1726773109.36681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773109.36839: done with get_vars() 8240 1726773109.36847: done getting variables 8240 1726773109.36892: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:11:49 -0400 (0:00:00.570) 0:01:28.013 **** 8240 1726773109.36919: entering _queue_task() for managed_node2/copy 8240 1726773109.37084: worker is 1 (out of 1 available) 8240 1726773109.37100: exiting _queue_task() for managed_node2/copy 8240 1726773109.37116: done queuing things up, now waiting for results queue to drain 8240 1726773109.37120: waiting for pending results... 
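Note: the Ensure kernel_settings is in active_profile result above, and the Set profile_mode to manual task that follows, are both copy actions writing small files under /etc/tuned; as the log shows, the copy action first runs ansible.legacy.stat against the destination and, because the content already matched, only the file module runs, so both tasks report changed: false. A hedged sketch of roughly equivalent tasks, reconstructed from the logged paths and modes; the content values are assumptions (the active_profile content from the fact set above, the 7-byte profile_mode content inferred from the task name):

    # Sketch only; content values are inferred, not copied from the role source.
    - name: Ensure kernel_settings is in active_profile
      ansible.builtin.copy:
        content: "{{ __kernel_settings_active_profile }}\n"
        dest: /etc/tuned/active_profile
        mode: "0600"

    - name: Set profile_mode to manual
      ansible.builtin.copy:
        content: "manual\n"
        dest: /etc/tuned/profile_mode
        mode: "0600"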
11472 1726773109.37259: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 11472 1726773109.37395: in run() - task 0affffe7-6841-885f-bbcf-000000000ee1 11472 1726773109.37411: variable 'ansible_search_path' from source: unknown 11472 1726773109.37414: variable 'ansible_search_path' from source: unknown 11472 1726773109.37439: calling self._execute() 11472 1726773109.37508: variable 'ansible_host' from source: host vars for 'managed_node2' 11472 1726773109.37514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11472 1726773109.37520: variable 'omit' from source: magic vars 11472 1726773109.37600: variable 'omit' from source: magic vars 11472 1726773109.37640: variable 'omit' from source: magic vars 11472 1726773109.37661: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 11472 1726773109.37894: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 11472 1726773109.37955: variable '__kernel_settings_tuned_dir' from source: role '' all vars 11472 1726773109.37987: variable 'omit' from source: magic vars 11472 1726773109.38020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11472 1726773109.38047: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11472 1726773109.38065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11472 1726773109.38079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11472 1726773109.38106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11472 1726773109.38132: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11472 1726773109.38137: variable 'ansible_host' from source: host vars for 'managed_node2' 11472 1726773109.38142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11472 1726773109.38211: Set connection var ansible_pipelining to False 11472 1726773109.38218: Set connection var ansible_timeout to 10 11472 1726773109.38226: Set connection var ansible_module_compression to ZIP_DEFLATED 11472 1726773109.38229: Set connection var ansible_shell_type to sh 11472 1726773109.38235: Set connection var ansible_shell_executable to /bin/sh 11472 1726773109.38240: Set connection var ansible_connection to ssh 11472 1726773109.38255: variable 'ansible_shell_executable' from source: unknown 11472 1726773109.38259: variable 'ansible_connection' from source: unknown 11472 1726773109.38262: variable 'ansible_module_compression' from source: unknown 11472 1726773109.38266: variable 'ansible_shell_type' from source: unknown 11472 1726773109.38269: variable 'ansible_shell_executable' from source: unknown 11472 1726773109.38272: variable 'ansible_host' from source: host vars for 'managed_node2' 11472 1726773109.38277: variable 'ansible_pipelining' from source: unknown 11472 1726773109.38280: variable 'ansible_timeout' from source: unknown 11472 1726773109.38284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11472 1726773109.38376: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11472 1726773109.38389: variable 'omit' from source: magic vars 11472 1726773109.38395: starting attempt loop 11472 1726773109.38399: running the handler 11472 1726773109.38411: _low_level_execute_command(): starting 11472 1726773109.38418: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11472 1726773109.40748: stdout chunk (state=2): >>>/root <<< 11472 1726773109.40864: stderr chunk (state=3): >>><<< 11472 1726773109.40871: stdout chunk (state=3): >>><<< 11472 1726773109.40891: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11472 1726773109.40906: _low_level_execute_command(): starting 11472 1726773109.40913: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840 `" && echo ansible-tmp-1726773109.409003-11472-121528541463840="` echo /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840 `" ) && sleep 0' 11472 1726773109.43476: stdout chunk (state=2): >>>ansible-tmp-1726773109.409003-11472-121528541463840=/root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840 <<< 11472 1726773109.43609: stderr chunk (state=3): >>><<< 11472 1726773109.43616: stdout chunk (state=3): >>><<< 11472 1726773109.43632: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773109.409003-11472-121528541463840=/root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840 , stderr= 11472 1726773109.43705: variable 'ansible_module_compression' from source: unknown 11472 1726773109.43751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11472 1726773109.43787: variable 'ansible_facts' from source: unknown 11472 1726773109.43849: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/AnsiballZ_stat.py 11472 1726773109.43936: Sending initial data 11472 1726773109.43945: Sent initial data (151 bytes) 11472 1726773109.46411: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp1fmswivi /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/AnsiballZ_stat.py <<< 11472 1726773109.47527: stderr chunk (state=3): >>><<< 11472 1726773109.47537: stdout chunk (state=3): >>><<< 11472 1726773109.47558: done transferring module to remote 11472 1726773109.47569: _low_level_execute_command(): starting 11472 1726773109.47576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/ /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/AnsiballZ_stat.py && sleep 0' 11472 1726773109.49948: stderr chunk (state=2): >>><<< 11472 1726773109.49959: stdout chunk (state=2): >>><<< 11472 1726773109.49975: _low_level_execute_command() done: rc=0, stdout=, stderr= 11472 1726773109.49979: _low_level_execute_command(): starting 11472 1726773109.49986: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/AnsiballZ_stat.py && sleep 0' 11472 1726773109.65849: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": 
false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773098.8571188, "mtime": 1726773100.6301363, "ctime": 1726773100.6301363, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11472 1726773109.66981: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11472 1726773109.67031: stderr chunk (state=3): >>><<< 11472 1726773109.67038: stdout chunk (state=3): >>><<< 11472 1726773109.67056: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773098.8571188, "mtime": 1726773100.6301363, "ctime": 1726773100.6301363, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
11472 1726773109.67105: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11472 1726773109.67141: variable 'ansible_module_compression' from source: unknown 11472 1726773109.67177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11472 1726773109.67200: variable 'ansible_facts' from source: unknown 11472 1726773109.67256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/AnsiballZ_file.py 11472 1726773109.67346: Sending initial data 11472 1726773109.67354: Sent initial data (151 bytes) 11472 1726773109.69910: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmputg9t492 /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/AnsiballZ_file.py <<< 11472 1726773109.71041: stderr chunk (state=3): >>><<< 11472 1726773109.71051: stdout chunk (state=3): >>><<< 11472 1726773109.71072: done transferring module to remote 11472 1726773109.71082: _low_level_execute_command(): starting 11472 1726773109.71088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/ /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/AnsiballZ_file.py && sleep 0' 11472 1726773109.73470: stderr chunk (state=2): >>><<< 11472 1726773109.73479: stdout chunk (state=2): >>><<< 11472 1726773109.73495: _low_level_execute_command() done: rc=0, stdout=, stderr= 11472 1726773109.73499: _low_level_execute_command(): starting 11472 1726773109.73505: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/AnsiballZ_file.py && sleep 0' 11472 1726773109.89642: stdout chunk (state=2): >>> {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmph1zusq8y", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11472 1726773109.90761: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 11472 1726773109.90812: stderr chunk (state=3): >>><<< 11472 1726773109.90820: stdout chunk (state=3): >>><<< 11472 1726773109.90836: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/profile_mode", "changed": false, "diff": {"before": {"path": "/etc/tuned/profile_mode"}, "after": {"path": "/etc/tuned/profile_mode"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"mode": "0600", "dest": "/etc/tuned/profile_mode", "_original_basename": "tmph1zusq8y", "recurse": false, "state": "file", "path": "/etc/tuned/profile_mode", "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11472 1726773109.90868: done with _execute_module (ansible.legacy.file, {'mode': '0600', 'dest': '/etc/tuned/profile_mode', '_original_basename': 'tmph1zusq8y', 'recurse': False, 'state': 'file', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11472 1726773109.90881: _low_level_execute_command(): starting 11472 1726773109.90888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773109.409003-11472-121528541463840/ > /dev/null 2>&1 && sleep 0' 11472 1726773109.93316: stderr chunk (state=2): >>><<< 11472 1726773109.93326: stdout chunk (state=2): >>><<< 11472 1726773109.93342: _low_level_execute_command() done: rc=0, stdout=, stderr= 11472 1726773109.93351: handler run complete 11472 1726773109.93371: attempt loop complete, returning result 11472 1726773109.93375: _execute() done 11472 1726773109.93378: dumping result to json 11472 1726773109.93383: done dumping result, returning 11472 1726773109.93392: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-885f-bbcf-000000000ee1] 11472 1726773109.93397: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee1 11472 1726773109.93431: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee1 11472 1726773109.93435: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/etc/tuned/profile_mode", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "state": "file", "uid": 0 } 8240 1726773109.93612: no more pending results, returning what we have 8240 1726773109.93616: results queue empty 8240 1726773109.93617: checking for any_errors_fatal 8240 1726773109.93623: done checking for any_errors_fatal 8240 1726773109.93624: 
checking for max_fail_percentage 8240 1726773109.93625: done checking for max_fail_percentage 8240 1726773109.93626: checking to see if all hosts have failed and the running result is not ok 8240 1726773109.93627: done checking to see if all hosts have failed 8240 1726773109.93627: getting the remaining hosts for this loop 8240 1726773109.93628: done getting the remaining hosts for this loop 8240 1726773109.93632: getting the next task for host managed_node2 8240 1726773109.93639: done getting next task for host managed_node2 8240 1726773109.93643: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8240 1726773109.93647: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773109.93657: getting variables 8240 1726773109.93659: in VariableManager get_vars() 8240 1726773109.93695: Calling all_inventory to load vars for managed_node2 8240 1726773109.93698: Calling groups_inventory to load vars for managed_node2 8240 1726773109.93699: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773109.93710: Calling all_plugins_play to load vars for managed_node2 8240 1726773109.93713: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773109.93715: Calling groups_plugins_play to load vars for managed_node2 8240 1726773109.93826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773109.93952: done with get_vars() 8240 1726773109.93961: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:11:49 -0400 (0:00:00.571) 0:01:28.584 **** 8240 1726773109.94031: entering _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773109.94206: worker is 1 (out of 1 available) 8240 1726773109.94221: exiting _queue_task() for managed_node2/fedora.linux_system_roles.kernel_settings_get_config 8240 1726773109.94235: done queuing things up, now waiting for results queue to drain 8240 1726773109.94237: waiting for pending results... 
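The "Set profile_mode to manual" task above comes back ok rather than changed because the destination already held the expected content: the stat step found a matching checksum, so only ansible.legacy.file ran to enforce mode and ownership, and no new content was transferred. A simplified, hedged sketch of that decision (function and argument names are invented; the real copy action plugin has more branches):

    # Hedged sketch of the stat -> file / copy decision visible in this task.
    def ensure_remote_file(remote_stat, desired_checksum, transfer_source, run_file, run_copy):
        if remote_stat.get("exists") and remote_stat.get("checksum") == desired_checksum:
            # Content already matches: only attributes (mode 0600 here) are enforced,
            # so the task result is "ok" / changed=false.
            return run_file()
        # Content differs: upload the source over sftp and let the copy module
        # rewrite the file, which reports changed=true (the later
        # "Apply kernel settings" task below takes this branch).
        transfer_source()
        return run_copy()
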
11484 1726773109.94369: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config 11484 1726773109.94508: in run() - task 0affffe7-6841-885f-bbcf-000000000ee2 11484 1726773109.94525: variable 'ansible_search_path' from source: unknown 11484 1726773109.94528: variable 'ansible_search_path' from source: unknown 11484 1726773109.94557: calling self._execute() 11484 1726773109.94631: variable 'ansible_host' from source: host vars for 'managed_node2' 11484 1726773109.94640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11484 1726773109.94650: variable 'omit' from source: magic vars 11484 1726773109.94731: variable 'omit' from source: magic vars 11484 1726773109.94776: variable 'omit' from source: magic vars 11484 1726773109.94804: variable '__kernel_settings_profile_filename' from source: role '' all vars 11484 1726773109.95031: variable '__kernel_settings_profile_filename' from source: role '' all vars 11484 1726773109.95094: variable '__kernel_settings_profile_dir' from source: role '' all vars 11484 1726773109.95160: variable '__kernel_settings_profile_parent' from source: set_fact 11484 1726773109.95170: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11484 1726773109.95265: variable 'omit' from source: magic vars 11484 1726773109.95301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11484 1726773109.95329: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11484 1726773109.95349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11484 1726773109.95364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11484 1726773109.95376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11484 1726773109.95402: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11484 1726773109.95408: variable 'ansible_host' from source: host vars for 'managed_node2' 11484 1726773109.95412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11484 1726773109.95481: Set connection var ansible_pipelining to False 11484 1726773109.95491: Set connection var ansible_timeout to 10 11484 1726773109.95498: Set connection var ansible_module_compression to ZIP_DEFLATED 11484 1726773109.95502: Set connection var ansible_shell_type to sh 11484 1726773109.95507: Set connection var ansible_shell_executable to /bin/sh 11484 1726773109.95512: Set connection var ansible_connection to ssh 11484 1726773109.95528: variable 'ansible_shell_executable' from source: unknown 11484 1726773109.95532: variable 'ansible_connection' from source: unknown 11484 1726773109.95535: variable 'ansible_module_compression' from source: unknown 11484 1726773109.95539: variable 'ansible_shell_type' from source: unknown 11484 1726773109.95542: variable 'ansible_shell_executable' from source: unknown 11484 1726773109.95546: variable 'ansible_host' from source: host vars for 'managed_node2' 11484 1726773109.95550: variable 'ansible_pipelining' from source: unknown 11484 1726773109.95552: variable 'ansible_timeout' from source: unknown 11484 1726773109.95554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11484 1726773109.95674: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11484 1726773109.95684: variable 'omit' from source: magic vars 11484 1726773109.95691: starting attempt loop 11484 1726773109.95693: running the handler 11484 1726773109.95703: _low_level_execute_command(): starting 11484 1726773109.95709: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11484 1726773109.97994: stdout chunk (state=2): >>>/root <<< 11484 1726773109.98118: stderr chunk (state=3): >>><<< 11484 1726773109.98125: stdout chunk (state=3): >>><<< 11484 1726773109.98144: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11484 1726773109.98157: _low_level_execute_command(): starting 11484 1726773109.98164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848 `" && echo ansible-tmp-1726773109.9815168-11484-183532224159848="` echo /root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848 `" ) && sleep 0' 11484 1726773110.00807: stdout chunk (state=2): >>>ansible-tmp-1726773109.9815168-11484-183532224159848=/root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848 <<< 11484 1726773110.00939: stderr chunk (state=3): >>><<< 11484 1726773110.00947: stdout chunk (state=3): >>><<< 11484 1726773110.00963: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773109.9815168-11484-183532224159848=/root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848 , stderr= 11484 1726773110.01003: variable 'ansible_module_compression' from source: unknown 11484 1726773110.01039: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 11484 1726773110.01074: variable 'ansible_facts' from source: unknown 11484 1726773110.01134: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848/AnsiballZ_kernel_settings_get_config.py 11484 1726773110.01237: Sending initial data 11484 1726773110.01244: Sent initial data (174 bytes) 11484 1726773110.03744: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpdr68k6fz /root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848/AnsiballZ_kernel_settings_get_config.py <<< 11484 1726773110.04819: stderr chunk (state=3): >>><<< 11484 1726773110.04828: stdout chunk (state=3): >>><<< 11484 1726773110.04848: done transferring module to remote 11484 1726773110.04859: _low_level_execute_command(): starting 11484 1726773110.04864: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848/ /root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11484 1726773110.07200: stderr chunk (state=2): >>><<< 11484 1726773110.07211: stdout chunk (state=2): >>><<< 11484 1726773110.07226: _low_level_execute_command() done: rc=0, stdout=, stderr= 11484 1726773110.07231: _low_level_execute_command(): starting 11484 1726773110.07236: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848/AnsiballZ_kernel_settings_get_config.py && sleep 0' 11484 1726773110.22719: stdout chunk (state=2): >>> {"changed": false, "data": {"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 11484 1726773110.23800: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11484 1726773110.23848: stderr chunk (state=3): >>><<< 11484 1726773110.23855: stdout chunk (state=3): >>><<< 11484 1726773110.23870: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.9.64 closed. 11484 1726773110.23897: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11484 1726773110.23911: _low_level_execute_command(): starting 11484 1726773110.23917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773109.9815168-11484-183532224159848/ > /dev/null 2>&1 && sleep 0' 11484 1726773110.26332: stderr chunk (state=2): >>><<< 11484 1726773110.26341: stdout chunk (state=2): >>><<< 11484 1726773110.26356: _low_level_execute_command() done: rc=0, stdout=, stderr= 11484 1726773110.26363: handler run complete 11484 1726773110.26379: attempt loop complete, returning result 11484 1726773110.26383: _execute() done 11484 1726773110.26387: dumping result to json 11484 1726773110.26394: done dumping result, returning 11484 1726773110.26404: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-885f-bbcf-000000000ee2] 11484 1726773110.26410: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee2 11484 1726773110.26439: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee2 11484 1726773110.26442: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "data": { "main": { "summary": "kernel settings" }, "vm": { "transparent_hugepages": "never" } } } 8240 1726773110.26603: no more pending results, returning what we have 8240 1726773110.26607: results queue empty 8240 1726773110.26608: checking for any_errors_fatal 8240 1726773110.26615: done checking for any_errors_fatal 8240 1726773110.26615: checking for max_fail_percentage 8240 1726773110.26617: done checking for max_fail_percentage 8240 1726773110.26617: checking to see if all hosts have failed and the running result is not ok 8240 1726773110.26618: done checking to see if all hosts have failed 8240 1726773110.26619: getting the remaining hosts for this loop 8240 1726773110.26620: done getting the 
remaining hosts for this loop 8240 1726773110.26623: getting the next task for host managed_node2 8240 1726773110.26630: done getting next task for host managed_node2 8240 1726773110.26633: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8240 1726773110.26637: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773110.26648: getting variables 8240 1726773110.26650: in VariableManager get_vars() 8240 1726773110.26683: Calling all_inventory to load vars for managed_node2 8240 1726773110.26688: Calling groups_inventory to load vars for managed_node2 8240 1726773110.26690: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773110.26699: Calling all_plugins_play to load vars for managed_node2 8240 1726773110.26702: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773110.26704: Calling groups_plugins_play to load vars for managed_node2 8240 1726773110.26856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773110.26975: done with get_vars() 8240 1726773110.26983: done getting variables 8240 1726773110.27027: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:11:50 -0400 (0:00:00.330) 0:01:28.914 **** 8240 1726773110.27052: entering _queue_task() for managed_node2/template 8240 1726773110.27218: worker is 1 (out of 1 available) 8240 1726773110.27232: exiting _queue_task() for managed_node2/template 8240 1726773110.27246: done queuing things up, now waiting for results queue to drain 8240 1726773110.27248: waiting for pending results... 
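The Get current config result above shows the module returning the existing /etc/tuned/kernel_settings/tuned.conf as a dict of INI sections ("main" and "vm"). Purely as an illustration of that data shape, and not the collection's actual kernel_settings_get_config implementation:

    # Hedged sketch: a generic INI reader that yields the same section/option
    # dictionary shape as the "data" field in the result above.
    import configparser

    def read_tuned_conf(path="/etc/tuned/kernel_settings/tuned.conf"):
        parser = configparser.ConfigParser()
        parser.read(path)
        return {section: dict(parser[section]) for section in parser.sections()}

    # For the file found in this run, the result would be roughly:
    # {"main": {"summary": "kernel settings"}, "vm": {"transparent_hugepages": "never"}}
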
11495 1726773110.27390: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 11495 1726773110.27520: in run() - task 0affffe7-6841-885f-bbcf-000000000ee3 11495 1726773110.27536: variable 'ansible_search_path' from source: unknown 11495 1726773110.27540: variable 'ansible_search_path' from source: unknown 11495 1726773110.27569: calling self._execute() 11495 1726773110.27642: variable 'ansible_host' from source: host vars for 'managed_node2' 11495 1726773110.27650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11495 1726773110.27659: variable 'omit' from source: magic vars 11495 1726773110.27737: variable 'omit' from source: magic vars 11495 1726773110.27775: variable 'omit' from source: magic vars 11495 1726773110.28020: variable '__kernel_settings_profile_src' from source: role '' all vars 11495 1726773110.28030: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11495 1726773110.28088: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11495 1726773110.28112: variable '__kernel_settings_profile_filename' from source: role '' all vars 11495 1726773110.28158: variable '__kernel_settings_profile_filename' from source: role '' all vars 11495 1726773110.28211: variable '__kernel_settings_profile_dir' from source: role '' all vars 11495 1726773110.28275: variable '__kernel_settings_profile_parent' from source: set_fact 11495 1726773110.28283: variable '__kernel_settings_tuned_profile' from source: role '' all vars 11495 1726773110.28312: variable 'omit' from source: magic vars 11495 1726773110.28345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11495 1726773110.28370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11495 1726773110.28392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11495 1726773110.28409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11495 1726773110.28421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11495 1726773110.28446: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11495 1726773110.28451: variable 'ansible_host' from source: host vars for 'managed_node2' 11495 1726773110.28456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11495 1726773110.28528: Set connection var ansible_pipelining to False 11495 1726773110.28536: Set connection var ansible_timeout to 10 11495 1726773110.28543: Set connection var ansible_module_compression to ZIP_DEFLATED 11495 1726773110.28547: Set connection var ansible_shell_type to sh 11495 1726773110.28552: Set connection var ansible_shell_executable to /bin/sh 11495 1726773110.28556: Set connection var ansible_connection to ssh 11495 1726773110.28570: variable 'ansible_shell_executable' from source: unknown 11495 1726773110.28573: variable 'ansible_connection' from source: unknown 11495 1726773110.28575: variable 'ansible_module_compression' from source: unknown 11495 1726773110.28576: variable 'ansible_shell_type' from source: unknown 11495 1726773110.28578: variable 'ansible_shell_executable' from source: unknown 11495 1726773110.28580: variable 'ansible_host' from source: host vars for 'managed_node2' 11495 1726773110.28582: 
variable 'ansible_pipelining' from source: unknown 11495 1726773110.28583: variable 'ansible_timeout' from source: unknown 11495 1726773110.28588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11495 1726773110.28675: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11495 1726773110.28687: variable 'omit' from source: magic vars 11495 1726773110.28692: starting attempt loop 11495 1726773110.28695: running the handler 11495 1726773110.28706: _low_level_execute_command(): starting 11495 1726773110.28712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11495 1726773110.31009: stdout chunk (state=2): >>>/root <<< 11495 1726773110.31134: stderr chunk (state=3): >>><<< 11495 1726773110.31141: stdout chunk (state=3): >>><<< 11495 1726773110.31158: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11495 1726773110.31172: _low_level_execute_command(): starting 11495 1726773110.31179: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949 `" && echo ansible-tmp-1726773110.3116653-11495-48878237142949="` echo /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949 `" ) && sleep 0' 11495 1726773110.33689: stdout chunk (state=2): >>>ansible-tmp-1726773110.3116653-11495-48878237142949=/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949 <<< 11495 1726773110.33818: stderr chunk (state=3): >>><<< 11495 1726773110.33825: stdout chunk (state=3): >>><<< 11495 1726773110.33840: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773110.3116653-11495-48878237142949=/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949 , stderr= 11495 1726773110.33857: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 11495 1726773110.33874: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 11495 1726773110.33897: variable 'ansible_search_path' from source: unknown 11495 1726773110.34450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11495 1726773110.35954: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11495 1726773110.36010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11495 1726773110.36041: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11495 1726773110.36068: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11495 1726773110.36100: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11495 1726773110.36288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11495 1726773110.36312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11495 1726773110.36334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11495 1726773110.36361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11495 1726773110.36373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11495 1726773110.36599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11495 1726773110.36619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11495 1726773110.36637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11495 1726773110.36664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11495 1726773110.36676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11495 1726773110.36921: variable 'ansible_managed' from source: unknown 11495 1726773110.36929: variable '__sections' from source: task vars 11495 1726773110.37020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11495 1726773110.37038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11495 1726773110.37055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11495 1726773110.37080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11495 1726773110.37093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11495 1726773110.37164: variable 'kernel_settings_sysctl' from source: include params 11495 1726773110.37171: variable '__kernel_settings_state_empty' from source: role '' all vars 11495 1726773110.37181: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11495 1726773110.37217: variable '__sysctl_old' from source: task vars 11495 1726773110.37262: variable '__sysctl_old' from source: task vars 11495 1726773110.37404: variable 'kernel_settings_purge' from source: include params 11495 1726773110.37411: variable 'kernel_settings_sysctl' from source: include params 11495 1726773110.37416: variable '__kernel_settings_state_empty' from source: role '' all vars 11495 1726773110.37421: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11495 1726773110.37426: variable '__kernel_settings_profile_contents' from source: set_fact 11495 1726773110.37553: variable 'kernel_settings_sysfs' from source: include params 11495 1726773110.37560: variable '__kernel_settings_state_empty' from source: role '' all vars 11495 1726773110.37566: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11495 1726773110.37580: variable '__sysfs_old' from source: task vars 11495 1726773110.37625: variable '__sysfs_old' from source: task vars 11495 1726773110.37758: variable 'kernel_settings_purge' from source: include params 11495 1726773110.37765: variable 'kernel_settings_sysfs' from source: include params 11495 1726773110.37770: variable '__kernel_settings_state_empty' from source: role '' all vars 11495 1726773110.37775: variable '__kernel_settings_previous_replaced' from source: role '' all vars 11495 1726773110.37780: variable '__kernel_settings_profile_contents' from source: set_fact 11495 1726773110.37796: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 11495 1726773110.37807: variable '__systemd_old' from source: task vars 11495 1726773110.37854: variable '__systemd_old' from source: task vars 11495 1726773110.37983: variable 'kernel_settings_purge' from source: include params 11495 1726773110.37991: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 11495 1726773110.37996: variable '__kernel_settings_state_absent' from source: role '' all vars 11495 1726773110.38004: variable '__kernel_settings_profile_contents' from source: set_fact 11495 1726773110.38014: variable 'kernel_settings_transparent_hugepages' from source: include params 11495 1726773110.38019: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 11495 1726773110.38024: variable '__trans_huge_old' from source: task vars 11495 1726773110.38063: variable '__trans_huge_old' from source: task vars 11495 1726773110.38193: variable 'kernel_settings_purge' from source: include params 11495 1726773110.38200: variable 'kernel_settings_transparent_hugepages' from source: include params 11495 1726773110.38207: variable '__kernel_settings_state_absent' from source: role '' all vars 11495 1726773110.38213: variable '__kernel_settings_profile_contents' from source: set_fact 11495 1726773110.38222: variable '__trans_defrag_old' from source: task vars 11495 1726773110.38261: 
variable '__trans_defrag_old' from source: task vars 11495 1726773110.38389: variable 'kernel_settings_purge' from source: include params 11495 1726773110.38396: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 11495 1726773110.38401: variable '__kernel_settings_state_absent' from source: role '' all vars 11495 1726773110.38409: variable '__kernel_settings_profile_contents' from source: set_fact 11495 1726773110.38425: variable '__kernel_settings_state_absent' from source: role '' all vars 11495 1726773110.38436: variable '__kernel_settings_state_absent' from source: role '' all vars 11495 1726773110.38442: variable '__kernel_settings_state_absent' from source: role '' all vars 11495 1726773110.39089: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11495 1726773110.39134: variable 'ansible_module_compression' from source: unknown 11495 1726773110.39177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11495 1726773110.39213: variable 'ansible_facts' from source: unknown 11495 1726773110.39273: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/AnsiballZ_stat.py 11495 1726773110.39367: Sending initial data 11495 1726773110.39374: Sent initial data (151 bytes) 11495 1726773110.41945: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpuwir89fv /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/AnsiballZ_stat.py <<< 11495 1726773110.43057: stderr chunk (state=3): >>><<< 11495 1726773110.43066: stdout chunk (state=3): >>><<< 11495 1726773110.43087: done transferring module to remote 11495 1726773110.43098: _low_level_execute_command(): starting 11495 1726773110.43103: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/ /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/AnsiballZ_stat.py && sleep 0' 11495 1726773110.45502: stderr chunk (state=2): >>><<< 11495 1726773110.45514: stdout chunk (state=2): >>><<< 11495 1726773110.45529: _low_level_execute_command() done: rc=0, stdout=, stderr= 11495 1726773110.45534: _low_level_execute_command(): starting 11495 1726773110.45539: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/AnsiballZ_stat.py && sleep 0' 11495 1726773110.61646: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 121, "inode": 224395467, "dev": 51713, "nlink": 1, "atime": 1726773100.6181362, "mtime": 1726773099.8711288, "ctime": 1726773100.1151311, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": 
"0b586509c0bdce12a2dde058e3374dab88cf7f2c", "mimetype": "text/plain", "charset": "us-ascii", "version": "2129606864", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11495 1726773110.62814: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11495 1726773110.62861: stderr chunk (state=3): >>><<< 11495 1726773110.62868: stdout chunk (state=3): >>><<< 11495 1726773110.62887: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 121, "inode": 224395467, "dev": 51713, "nlink": 1, "atime": 1726773100.6181362, "mtime": 1726773099.8711288, "ctime": 1726773100.1151311, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "0b586509c0bdce12a2dde058e3374dab88cf7f2c", "mimetype": "text/plain", "charset": "us-ascii", "version": "2129606864", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 11495 1726773110.62946: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11495 1726773110.63038: Sending initial data 11495 1726773110.63045: Sent initial data (159 bytes) 11495 1726773110.65644: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpctbcfh_g/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/source <<< 11495 1726773110.66029: stderr chunk (state=3): >>><<< 11495 1726773110.66039: stdout chunk (state=3): >>><<< 11495 1726773110.66055: _low_level_execute_command(): starting 11495 1726773110.66061: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/ /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/source && sleep 0' 11495 1726773110.68419: stderr chunk (state=2): >>><<< 11495 1726773110.68430: stdout chunk (state=2): >>><<< 11495 1726773110.68446: _low_level_execute_command() done: rc=0, stdout=, stderr= 11495 1726773110.68467: variable 'ansible_module_compression' from source: unknown 
11495 1726773110.68503: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11495 1726773110.68522: variable 'ansible_facts' from source: unknown 11495 1726773110.68579: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/AnsiballZ_copy.py 11495 1726773110.68670: Sending initial data 11495 1726773110.68677: Sent initial data (151 bytes) 11495 1726773110.71221: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpwe_jtnh3 /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/AnsiballZ_copy.py <<< 11495 1726773110.72359: stderr chunk (state=3): >>><<< 11495 1726773110.72370: stdout chunk (state=3): >>><<< 11495 1726773110.72392: done transferring module to remote 11495 1726773110.72402: _low_level_execute_command(): starting 11495 1726773110.72408: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/ /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/AnsiballZ_copy.py && sleep 0' 11495 1726773110.74797: stderr chunk (state=2): >>><<< 11495 1726773110.74811: stdout chunk (state=2): >>><<< 11495 1726773110.74826: _low_level_execute_command() done: rc=0, stdout=, stderr= 11495 1726773110.74831: _low_level_execute_command(): starting 11495 1726773110.74837: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/AnsiballZ_copy.py && sleep 0' 11495 1726773110.91159: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11495 1726773110.92286: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. 
<<< 11495 1726773110.92334: stderr chunk (state=3): >>><<< 11495 1726773110.92341: stdout chunk (state=3): >>><<< 11495 1726773110.92357: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11495 1726773110.92386: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11495 1726773110.92417: _low_level_execute_command(): starting 11495 1726773110.92424: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/ > /dev/null 2>&1 && sleep 0' 11495 1726773110.94831: stderr chunk (state=2): >>><<< 11495 1726773110.94841: stdout chunk (state=2): >>><<< 11495 1726773110.94856: _low_level_execute_command() done: rc=0, stdout=, stderr= 11495 1726773110.94866: handler run complete 11495 1726773110.94887: attempt loop complete, returning result 11495 1726773110.94891: _execute() done 11495 1726773110.94894: dumping result to json 11495 1726773110.94903: done dumping result, returning 11495 1726773110.94912: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-885f-bbcf-000000000ee3] 11495 1726773110.94917: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee3 11495 1726773110.94960: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee3 11495 1726773110.94964: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "7d83891795eeb6debeff7e2812501630", "mode": "0644", "owner": "root", "secontext": 
"system_u:object_r:tuned_etc_t:s0", "size": 86, "src": "/root/.ansible/tmp/ansible-tmp-1726773110.3116653-11495-48878237142949/source", "state": "file", "uid": 0 } 8240 1726773110.95175: no more pending results, returning what we have 8240 1726773110.95179: results queue empty 8240 1726773110.95180: checking for any_errors_fatal 8240 1726773110.95188: done checking for any_errors_fatal 8240 1726773110.95188: checking for max_fail_percentage 8240 1726773110.95190: done checking for max_fail_percentage 8240 1726773110.95190: checking to see if all hosts have failed and the running result is not ok 8240 1726773110.95192: done checking to see if all hosts have failed 8240 1726773110.95192: getting the remaining hosts for this loop 8240 1726773110.95193: done getting the remaining hosts for this loop 8240 1726773110.95197: getting the next task for host managed_node2 8240 1726773110.95204: done getting next task for host managed_node2 8240 1726773110.95207: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8240 1726773110.95211: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773110.95222: getting variables 8240 1726773110.95224: in VariableManager get_vars() 8240 1726773110.95257: Calling all_inventory to load vars for managed_node2 8240 1726773110.95259: Calling groups_inventory to load vars for managed_node2 8240 1726773110.95260: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773110.95268: Calling all_plugins_play to load vars for managed_node2 8240 1726773110.95270: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773110.95272: Calling groups_plugins_play to load vars for managed_node2 8240 1726773110.95381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773110.95509: done with get_vars() 8240 1726773110.95517: done getting variables 8240 1726773110.95560: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:11:50 -0400 (0:00:00.685) 0:01:29.599 **** 8240 1726773110.95591: entering _queue_task() for managed_node2/service 8240 1726773110.95761: worker is 1 (out of 1 available) 8240 1726773110.95776: exiting _queue_task() for managed_node2/service 8240 1726773110.95791: done queuing things up, now waiting for results queue to drain 8240 1726773110.95793: waiting for pending results... 11510 1726773110.95932: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 11510 1726773110.96065: in run() - task 0affffe7-6841-885f-bbcf-000000000ee4 11510 1726773110.96081: variable 'ansible_search_path' from source: unknown 11510 1726773110.96086: variable 'ansible_search_path' from source: unknown 11510 1726773110.96126: variable '__kernel_settings_services' from source: include_vars 11510 1726773110.96372: variable '__kernel_settings_services' from source: include_vars 11510 1726773110.96521: variable 'omit' from source: magic vars 11510 1726773110.96600: variable 'ansible_host' from source: host vars for 'managed_node2' 11510 1726773110.96614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11510 1726773110.96623: variable 'omit' from source: magic vars 11510 1726773110.96816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11510 1726773110.96991: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11510 1726773110.97030: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11510 1726773110.97056: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11510 1726773110.97081: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11510 1726773110.97164: variable '__kernel_settings_register_profile' from source: set_fact 11510 1726773110.97176: variable '__kernel_settings_register_mode' from source: set_fact 11510 1726773110.97196: Evaluated conditional 
(__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): False 11510 1726773110.97200: when evaluation is False, skipping this task 11510 1726773110.97227: variable 'item' from source: unknown 11510 1726773110.97275: variable 'item' from source: unknown skipping: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed", "item": "tuned", "skip_reason": "Conditional result was False" } 11510 1726773110.97309: dumping result to json 11510 1726773110.97315: done dumping result, returning 11510 1726773110.97320: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-885f-bbcf-000000000ee4] 11510 1726773110.97326: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee4 11510 1726773110.97349: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee4 11510 1726773110.97352: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 8240 1726773110.97518: no more pending results, returning what we have 8240 1726773110.97521: results queue empty 8240 1726773110.97522: checking for any_errors_fatal 8240 1726773110.97534: done checking for any_errors_fatal 8240 1726773110.97535: checking for max_fail_percentage 8240 1726773110.97536: done checking for max_fail_percentage 8240 1726773110.97537: checking to see if all hosts have failed and the running result is not ok 8240 1726773110.97538: done checking to see if all hosts have failed 8240 1726773110.97538: getting the remaining hosts for this loop 8240 1726773110.97540: done getting the remaining hosts for this loop 8240 1726773110.97543: getting the next task for host managed_node2 8240 1726773110.97550: done getting next task for host managed_node2 8240 1726773110.97553: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8240 1726773110.97557: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773110.97574: getting variables 8240 1726773110.97576: in VariableManager get_vars() 8240 1726773110.97607: Calling all_inventory to load vars for managed_node2 8240 1726773110.97610: Calling groups_inventory to load vars for managed_node2 8240 1726773110.97611: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773110.97619: Calling all_plugins_play to load vars for managed_node2 8240 1726773110.97621: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773110.97622: Calling groups_plugins_play to load vars for managed_node2 8240 1726773110.97728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773110.97850: done with get_vars() 8240 1726773110.97857: done getting variables 8240 1726773110.97902: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:11:50 -0400 (0:00:00.023) 0:01:29.623 **** 8240 1726773110.97926: entering _queue_task() for managed_node2/command 8240 1726773110.98097: worker is 1 (out of 1 available) 8240 1726773110.98111: exiting _queue_task() for managed_node2/command 8240 1726773110.98126: done queuing things up, now waiting for results queue to drain 8240 1726773110.98127: waiting for pending results... 11511 1726773110.98262: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 11511 1726773110.98398: in run() - task 0affffe7-6841-885f-bbcf-000000000ee5 11511 1726773110.98418: variable 'ansible_search_path' from source: unknown 11511 1726773110.98422: variable 'ansible_search_path' from source: unknown 11511 1726773110.98451: calling self._execute() 11511 1726773110.98526: variable 'ansible_host' from source: host vars for 'managed_node2' 11511 1726773110.98535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11511 1726773110.98543: variable 'omit' from source: magic vars 11511 1726773110.98906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11511 1726773110.99206: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11511 1726773110.99243: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11511 1726773110.99268: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11511 1726773110.99296: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11511 1726773110.99382: variable '__kernel_settings_register_profile' from source: set_fact 11511 1726773110.99410: Evaluated conditional (not __kernel_settings_register_profile is changed): True 11511 1726773110.99504: variable '__kernel_settings_register_mode' from source: set_fact 11511 1726773110.99516: Evaluated conditional (not __kernel_settings_register_mode is changed): True 11511 1726773110.99591: variable '__kernel_settings_register_apply' from source: set_fact 11511 1726773110.99604: 
Evaluated conditional (__kernel_settings_register_apply is changed): True 11511 1726773110.99611: variable 'omit' from source: magic vars 11511 1726773110.99647: variable 'omit' from source: magic vars 11511 1726773110.99733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11511 1726773111.01222: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11511 1726773111.01280: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11511 1726773111.01313: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11511 1726773111.01338: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11511 1726773111.01358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11511 1726773111.01425: variable '__kernel_settings_active_profile' from source: set_fact 11511 1726773111.01455: variable 'omit' from source: magic vars 11511 1726773111.01479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11511 1726773111.01505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11511 1726773111.01522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11511 1726773111.01536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11511 1726773111.01546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11511 1726773111.01570: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11511 1726773111.01576: variable 'ansible_host' from source: host vars for 'managed_node2' 11511 1726773111.01581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11511 1726773111.01654: Set connection var ansible_pipelining to False 11511 1726773111.01661: Set connection var ansible_timeout to 10 11511 1726773111.01669: Set connection var ansible_module_compression to ZIP_DEFLATED 11511 1726773111.01673: Set connection var ansible_shell_type to sh 11511 1726773111.01678: Set connection var ansible_shell_executable to /bin/sh 11511 1726773111.01682: Set connection var ansible_connection to ssh 11511 1726773111.01704: variable 'ansible_shell_executable' from source: unknown 11511 1726773111.01709: variable 'ansible_connection' from source: unknown 11511 1726773111.01713: variable 'ansible_module_compression' from source: unknown 11511 1726773111.01716: variable 'ansible_shell_type' from source: unknown 11511 1726773111.01720: variable 'ansible_shell_executable' from source: unknown 11511 1726773111.01723: variable 'ansible_host' from source: host vars for 'managed_node2' 11511 1726773111.01728: variable 'ansible_pipelining' from source: unknown 11511 1726773111.01731: variable 'ansible_timeout' from source: unknown 11511 1726773111.01736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11511 1726773111.01806: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11511 1726773111.01818: variable 'omit' from source: magic vars 11511 1726773111.01824: starting attempt loop 11511 1726773111.01828: running the handler 11511 1726773111.01840: _low_level_execute_command(): starting 11511 1726773111.01847: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11511 1726773111.04179: stdout chunk (state=2): >>>/root <<< 11511 1726773111.04294: stderr chunk (state=3): >>><<< 11511 1726773111.04305: stdout chunk (state=3): >>><<< 11511 1726773111.04325: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11511 1726773111.04338: _low_level_execute_command(): starting 11511 1726773111.04344: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074 `" && echo ansible-tmp-1726773111.0433347-11511-1660756481074="` echo /root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074 `" ) && sleep 0' 11511 1726773111.06951: stdout chunk (state=2): >>>ansible-tmp-1726773111.0433347-11511-1660756481074=/root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074 <<< 11511 1726773111.07085: stderr chunk (state=3): >>><<< 11511 1726773111.07096: stdout chunk (state=3): >>><<< 11511 1726773111.07116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773111.0433347-11511-1660756481074=/root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074 , stderr= 11511 1726773111.07142: variable 'ansible_module_compression' from source: unknown 11511 1726773111.07182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11511 1726773111.07219: variable 'ansible_facts' from source: unknown 11511 1726773111.07278: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074/AnsiballZ_command.py 11511 1726773111.07386: Sending initial data 11511 1726773111.07394: Sent initial data (153 bytes) 11511 1726773111.09912: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp00s2e0vt /root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074/AnsiballZ_command.py <<< 11511 1726773111.11032: stderr chunk (state=3): >>><<< 11511 1726773111.11042: stdout chunk (state=3): >>><<< 11511 1726773111.11063: done transferring module to remote 11511 1726773111.11074: _low_level_execute_command(): starting 11511 1726773111.11080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074/ /root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074/AnsiballZ_command.py && sleep 0' 11511 1726773111.13518: stderr chunk (state=2): >>><<< 11511 1726773111.13529: stdout chunk (state=2): >>><<< 11511 1726773111.13545: _low_level_execute_command() done: rc=0, stdout=, stderr= 11511 1726773111.13550: _low_level_execute_command(): starting 11511 1726773111.13555: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074/AnsiballZ_command.py && sleep 0' 11511 1726773112.44365: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest 
kernel_settings"], "start": "2024-09-19 15:11:51.284876", "end": "2024-09-19 15:11:52.440321", "delta": "0:00:01.155445", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11511 1726773112.45486: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11511 1726773112.45536: stderr chunk (state=3): >>><<< 11511 1726773112.45543: stdout chunk (state=3): >>><<< 11511 1726773112.45558: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "profile", "virtual-guest kernel_settings"], "start": "2024-09-19 15:11:51.284876", "end": "2024-09-19 15:11:52.440321", "delta": "0:00:01.155445", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm profile 'virtual-guest kernel_settings'", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11511 1726773112.45588: done with _execute_module (ansible.legacy.command, {'_raw_params': "tuned-adm profile 'virtual-guest kernel_settings'", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11511 1726773112.45598: _low_level_execute_command(): starting 11511 1726773112.45604: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773111.0433347-11511-1660756481074/ > /dev/null 2>&1 && sleep 0' 11511 1726773112.48027: stderr chunk (state=2): >>><<< 11511 1726773112.48036: stdout chunk (state=2): >>><<< 11511 1726773112.48050: _low_level_execute_command() done: rc=0, stdout=, stderr= 11511 1726773112.48057: handler run complete 11511 1726773112.48074: Evaluated conditional (True): True 11511 1726773112.48084: attempt loop complete, returning result 11511 1726773112.48090: _execute() done 11511 1726773112.48094: dumping result to json 11511 1726773112.48099: done dumping result, returning 11511 1726773112.48108: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-885f-bbcf-000000000ee5] 11511 1726773112.48115: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee5 11511 1726773112.48143: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee5 11511 1726773112.48147: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "cmd": [ "tuned-adm", "profile", "virtual-guest kernel_settings" ], "delta": "0:00:01.155445", "end": "2024-09-19 15:11:52.440321", "rc": 0, "start": "2024-09-19 15:11:51.284876" } 8240 1726773112.48296: no more pending results, returning what we have 8240 1726773112.48300: 
results queue empty 8240 1726773112.48301: checking for any_errors_fatal 8240 1726773112.48309: done checking for any_errors_fatal 8240 1726773112.48309: checking for max_fail_percentage 8240 1726773112.48311: done checking for max_fail_percentage 8240 1726773112.48312: checking to see if all hosts have failed and the running result is not ok 8240 1726773112.48313: done checking to see if all hosts have failed 8240 1726773112.48313: getting the remaining hosts for this loop 8240 1726773112.48314: done getting the remaining hosts for this loop 8240 1726773112.48317: getting the next task for host managed_node2 8240 1726773112.48325: done getting next task for host managed_node2 8240 1726773112.48328: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8240 1726773112.48332: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773112.48343: getting variables 8240 1726773112.48344: in VariableManager get_vars() 8240 1726773112.48380: Calling all_inventory to load vars for managed_node2 8240 1726773112.48383: Calling groups_inventory to load vars for managed_node2 8240 1726773112.48387: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773112.48397: Calling all_plugins_play to load vars for managed_node2 8240 1726773112.48400: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773112.48403: Calling groups_plugins_play to load vars for managed_node2 8240 1726773112.48525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773112.48718: done with get_vars() 8240 1726773112.48726: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:11:52 -0400 (0:00:01.508) 0:01:31.131 **** 8240 1726773112.48796: entering _queue_task() for managed_node2/include_tasks 8240 1726773112.48965: worker is 1 (out of 1 available) 8240 1726773112.48980: exiting _queue_task() for managed_node2/include_tasks 8240 1726773112.48995: done queuing things up, now waiting for results queue to drain 8240 1726773112.48997: waiting for pending results... 
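The apply step recorded above reduces to a single templated command task. A minimal sketch of what the task at roles/kernel_settings/tasks/main.yml:157 likely looks like, reconstructed only from the conditionals and module args captured in this log (the real role may word it differently, and the variable name in the template is taken from the "__kernel_settings_active_profile" entry logged just before the handler ran):

- name: Tuned apply settings
  # The rendered command in the log was: tuned-adm profile 'virtual-guest kernel_settings'
  # __kernel_settings_active_profile is assumed to hold the space-separated profile list.
  command: >-
    tuned-adm profile '{{ __kernel_settings_active_profile }}'
  when:
    - not __kernel_settings_register_profile is changed
    - not __kernel_settings_register_mode is changed
    - __kernel_settings_register_apply is changed

The three when: clauses mirror the three conditionals evaluated in the log entries above: the profile file and daemon mode tasks reported no change, while the earlier apply-tracking fact did, so only the tuned-adm profile reload is performed here.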
11525 1726773112.49132: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 11525 1726773112.49266: in run() - task 0affffe7-6841-885f-bbcf-000000000ee6 11525 1726773112.49283: variable 'ansible_search_path' from source: unknown 11525 1726773112.49290: variable 'ansible_search_path' from source: unknown 11525 1726773112.49320: calling self._execute() 11525 1726773112.49390: variable 'ansible_host' from source: host vars for 'managed_node2' 11525 1726773112.49398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11525 1726773112.49410: variable 'omit' from source: magic vars 11525 1726773112.49742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11525 1726773112.49929: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11525 1726773112.49965: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11525 1726773112.49996: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11525 1726773112.50027: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11525 1726773112.50116: variable '__kernel_settings_register_apply' from source: set_fact 11525 1726773112.50140: Evaluated conditional (__kernel_settings_register_apply is changed): True 11525 1726773112.50146: _execute() done 11525 1726773112.50151: dumping result to json 11525 1726773112.50155: done dumping result, returning 11525 1726773112.50161: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-885f-bbcf-000000000ee6] 11525 1726773112.50166: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee6 11525 1726773112.50191: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee6 11525 1726773112.50194: WORKER PROCESS EXITING 8240 1726773112.50304: no more pending results, returning what we have 8240 1726773112.50308: in VariableManager get_vars() 8240 1726773112.50346: Calling all_inventory to load vars for managed_node2 8240 1726773112.50349: Calling groups_inventory to load vars for managed_node2 8240 1726773112.50351: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773112.50361: Calling all_plugins_play to load vars for managed_node2 8240 1726773112.50364: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773112.50366: Calling groups_plugins_play to load vars for managed_node2 8240 1726773112.50488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773112.50603: done with get_vars() 8240 1726773112.50609: variable 'ansible_search_path' from source: unknown 8240 1726773112.50609: variable 'ansible_search_path' from source: unknown 8240 1726773112.50632: we have included files to process 8240 1726773112.50633: generating all_blocks data 8240 1726773112.50638: done generating all_blocks data 8240 1726773112.50642: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773112.50642: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8240 1726773112.50644: Loading data from 
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node2 8240 1726773112.50899: done processing included file 8240 1726773112.50903: iterating over new_blocks loaded from include file 8240 1726773112.50904: in VariableManager get_vars() 8240 1726773112.50921: done with get_vars() 8240 1726773112.50922: filtering new block on tags 8240 1726773112.50958: done filtering new block on tags 8240 1726773112.50960: done iterating over new_blocks loaded from include file 8240 1726773112.50960: extending task lists for all hosts with included blocks 8240 1726773112.51556: done extending task lists 8240 1726773112.51557: done processing included files 8240 1726773112.51558: results queue empty 8240 1726773112.51558: checking for any_errors_fatal 8240 1726773112.51561: done checking for any_errors_fatal 8240 1726773112.51562: checking for max_fail_percentage 8240 1726773112.51562: done checking for max_fail_percentage 8240 1726773112.51563: checking to see if all hosts have failed and the running result is not ok 8240 1726773112.51563: done checking to see if all hosts have failed 8240 1726773112.51564: getting the remaining hosts for this loop 8240 1726773112.51564: done getting the remaining hosts for this loop 8240 1726773112.51566: getting the next task for host managed_node2 8240 1726773112.51569: done getting next task for host managed_node2 8240 1726773112.51571: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8240 1726773112.51574: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773112.51581: getting variables 8240 1726773112.51581: in VariableManager get_vars() 8240 1726773112.51592: Calling all_inventory to load vars for managed_node2 8240 1726773112.51593: Calling groups_inventory to load vars for managed_node2 8240 1726773112.51594: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773112.51597: Calling all_plugins_play to load vars for managed_node2 8240 1726773112.51599: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773112.51600: Calling groups_plugins_play to load vars for managed_node2 8240 1726773112.51677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773112.51786: done with get_vars() 8240 1726773112.51793: done getting variables 8240 1726773112.51820: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:11:52 -0400 (0:00:00.030) 0:01:31.162 **** 8240 1726773112.51847: entering _queue_task() for managed_node2/command 8240 1726773112.52011: worker is 1 (out of 1 available) 8240 1726773112.52025: exiting _queue_task() for managed_node2/command 8240 1726773112.52038: done queuing things up, now waiting for results queue to drain 8240 1726773112.52040: waiting for pending results... 11526 1726773112.52175: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 11526 1726773112.52310: in run() - task 0affffe7-6841-885f-bbcf-000000000fc5 11526 1726773112.52326: variable 'ansible_search_path' from source: unknown 11526 1726773112.52330: variable 'ansible_search_path' from source: unknown 11526 1726773112.52357: calling self._execute() 11526 1726773112.52424: variable 'ansible_host' from source: host vars for 'managed_node2' 11526 1726773112.52432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11526 1726773112.52441: variable 'omit' from source: magic vars 11526 1726773112.52515: variable 'omit' from source: magic vars 11526 1726773112.52559: variable 'omit' from source: magic vars 11526 1726773112.52583: variable 'omit' from source: magic vars 11526 1726773112.52616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11526 1726773112.52641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11526 1726773112.52658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11526 1726773112.52671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11526 1726773112.52680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11526 1726773112.52705: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11526 1726773112.52710: variable 'ansible_host' from source: host vars for 'managed_node2' 11526 1726773112.52712: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11526 1726773112.52778: Set connection var ansible_pipelining to False 11526 1726773112.52784: Set connection var ansible_timeout to 10 11526 1726773112.52791: Set connection var ansible_module_compression to ZIP_DEFLATED 11526 1726773112.52793: Set connection var ansible_shell_type to sh 11526 1726773112.52796: Set connection var ansible_shell_executable to /bin/sh 11526 1726773112.52799: Set connection var ansible_connection to ssh 11526 1726773112.52814: variable 'ansible_shell_executable' from source: unknown 11526 1726773112.52817: variable 'ansible_connection' from source: unknown 11526 1726773112.52819: variable 'ansible_module_compression' from source: unknown 11526 1726773112.52821: variable 'ansible_shell_type' from source: unknown 11526 1726773112.52823: variable 'ansible_shell_executable' from source: unknown 11526 1726773112.52825: variable 'ansible_host' from source: host vars for 'managed_node2' 11526 1726773112.52827: variable 'ansible_pipelining' from source: unknown 11526 1726773112.52829: variable 'ansible_timeout' from source: unknown 11526 1726773112.52831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11526 1726773112.52938: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11526 1726773112.52951: variable 'omit' from source: magic vars 11526 1726773112.52957: starting attempt loop 11526 1726773112.52960: running the handler 11526 1726773112.52973: _low_level_execute_command(): starting 11526 1726773112.52980: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11526 1726773112.55339: stdout chunk (state=2): >>>/root <<< 11526 1726773112.55461: stderr chunk (state=3): >>><<< 11526 1726773112.55468: stdout chunk (state=3): >>><<< 11526 1726773112.55489: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11526 1726773112.55504: _low_level_execute_command(): starting 11526 1726773112.55510: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095 `" && echo ansible-tmp-1726773112.554977-11526-264247236150095="` echo /root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095 `" ) && sleep 0' 11526 1726773112.58320: stdout chunk (state=2): >>>ansible-tmp-1726773112.554977-11526-264247236150095=/root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095 <<< 11526 1726773112.58451: stderr chunk (state=3): >>><<< 11526 1726773112.58459: stdout chunk (state=3): >>><<< 11526 1726773112.58475: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773112.554977-11526-264247236150095=/root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095 , stderr= 11526 1726773112.58502: variable 'ansible_module_compression' from source: unknown 11526 1726773112.58551: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11526 1726773112.58590: variable 'ansible_facts' from source: unknown 11526 1726773112.58652: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095/AnsiballZ_command.py 11526 
1726773112.58756: Sending initial data 11526 1726773112.58763: Sent initial data (154 bytes) 11526 1726773112.61282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmprme57s39 /root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095/AnsiballZ_command.py <<< 11526 1726773112.62383: stderr chunk (state=3): >>><<< 11526 1726773112.62393: stdout chunk (state=3): >>><<< 11526 1726773112.62416: done transferring module to remote 11526 1726773112.62427: _low_level_execute_command(): starting 11526 1726773112.62432: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095/ /root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095/AnsiballZ_command.py && sleep 0' 11526 1726773112.64788: stderr chunk (state=2): >>><<< 11526 1726773112.64797: stdout chunk (state=2): >>><<< 11526 1726773112.64813: _low_level_execute_command() done: rc=0, stdout=, stderr= 11526 1726773112.64818: _low_level_execute_command(): starting 11526 1726773112.64823: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095/AnsiballZ_command.py && sleep 0' 11526 1726773112.90461: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:52.797096", "end": "2024-09-19 15:11:52.899228", "delta": "0:00:00.102132", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11526 1726773112.91324: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11526 1726773112.91372: stderr chunk (state=3): >>><<< 11526 1726773112.91378: stdout chunk (state=3): >>><<< 11526 1726773112.91399: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:11:52.797096", "end": "2024-09-19 15:11:52.899228", "delta": "0:00:00.102132", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11526 1726773112.91442: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11526 1726773112.91453: _low_level_execute_command(): starting 11526 1726773112.91459: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773112.554977-11526-264247236150095/ > /dev/null 2>&1 && sleep 0' 11526 1726773112.93895: stderr chunk (state=2): >>><<< 11526 1726773112.93903: stdout chunk (state=2): >>><<< 11526 1726773112.93918: _low_level_execute_command() done: rc=0, stdout=, stderr= 11526 1726773112.93925: handler run complete 11526 1726773112.93946: Evaluated conditional (False): False 11526 1726773112.93956: attempt loop complete, returning result 11526 1726773112.93960: _execute() done 11526 1726773112.93963: dumping result to json 11526 1726773112.93969: done dumping result, returning 11526 1726773112.93976: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-885f-bbcf-000000000fc5] 11526 1726773112.93982: sending task result for task 0affffe7-6841-885f-bbcf-000000000fc5 11526 1726773112.94015: done sending task result for task 0affffe7-6841-885f-bbcf-000000000fc5 11526 1726773112.94019: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.102132", "end": "2024-09-19 15:11:52.899228", "rc": 0, "start": "2024-09-19 15:11:52.797096" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8240 1726773112.94180: no more pending results, returning what we have 8240 1726773112.94184: results queue empty 8240 1726773112.94187: checking for any_errors_fatal 8240 1726773112.94189: done checking for any_errors_fatal 8240 1726773112.94190: checking for max_fail_percentage 8240 1726773112.94191: done checking for max_fail_percentage 8240 1726773112.94192: checking to see if all hosts have failed and the running result is not ok 8240 1726773112.94193: done checking to see if all hosts have failed 8240 1726773112.94194: getting the remaining hosts for this loop 8240 1726773112.94195: done getting the remaining hosts for this loop 8240 1726773112.94198: getting the next task for host managed_node2 8240 1726773112.94208: done getting next task for host managed_node2 8240 1726773112.94212: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8240 1726773112.94217: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773112.94227: getting variables 8240 1726773112.94229: in VariableManager get_vars() 8240 1726773112.94264: Calling all_inventory to load vars for managed_node2 8240 1726773112.94267: Calling groups_inventory to load vars for managed_node2 8240 1726773112.94268: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773112.94276: Calling all_plugins_play to load vars for managed_node2 8240 1726773112.94278: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773112.94280: Calling groups_plugins_play to load vars for managed_node2 8240 1726773112.94396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773112.94554: done with get_vars() 8240 1726773112.94563: done getting variables 8240 1726773112.94610: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:11:52 -0400 (0:00:00.427) 0:01:31.590 **** 8240 1726773112.94636: entering _queue_task() for managed_node2/shell 8240 1726773112.94813: worker is 1 (out of 1 available) 8240 1726773112.94827: exiting _queue_task() for managed_node2/shell 8240 1726773112.94839: done queuing things up, now waiting for results queue to drain 8240 1726773112.94842: waiting for pending results... 
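The verification that just completed is likewise a plain command task. A hedged sketch of the task at verify_settings.yml:2, inferred from the logged module args and from the conditionals tested by the two tasks that follow; the register name and changed_when are assumptions, not confirmed by the log:

- name: Check that settings are applied correctly
  command: tuned-adm verify -i
  register: __kernel_settings_register_verify_values   # name inferred from the later "is failed" conditionals
  changed_when: false   # inferred: the final result reports changed: false even though the command ran

tuned-adm verify compares the running system against the active profile, and -i (ignore-missing) skips settings the system does not support; the stdout captured above confirms the preset profile matched.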
11534 1726773112.94973: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 11534 1726773112.95112: in run() - task 0affffe7-6841-885f-bbcf-000000000fc6 11534 1726773112.95128: variable 'ansible_search_path' from source: unknown 11534 1726773112.95132: variable 'ansible_search_path' from source: unknown 11534 1726773112.95161: calling self._execute() 11534 1726773112.95234: variable 'ansible_host' from source: host vars for 'managed_node2' 11534 1726773112.95243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11534 1726773112.95251: variable 'omit' from source: magic vars 11534 1726773112.95590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11534 1726773112.95777: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11534 1726773112.95816: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11534 1726773112.95844: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11534 1726773112.95871: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11534 1726773112.95960: variable '__kernel_settings_register_verify_values' from source: set_fact 11534 1726773112.95984: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11534 1726773112.95991: when evaluation is False, skipping this task 11534 1726773112.95995: _execute() done 11534 1726773112.95999: dumping result to json 11534 1726773112.96003: done dumping result, returning 11534 1726773112.96009: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-885f-bbcf-000000000fc6] 11534 1726773112.96015: sending task result for task 0affffe7-6841-885f-bbcf-000000000fc6 11534 1726773112.96038: done sending task result for task 0affffe7-6841-885f-bbcf-000000000fc6 11534 1726773112.96041: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773112.96152: no more pending results, returning what we have 8240 1726773112.96156: results queue empty 8240 1726773112.96157: checking for any_errors_fatal 8240 1726773112.96166: done checking for any_errors_fatal 8240 1726773112.96167: checking for max_fail_percentage 8240 1726773112.96169: done checking for max_fail_percentage 8240 1726773112.96169: checking to see if all hosts have failed and the running result is not ok 8240 1726773112.96170: done checking to see if all hosts have failed 8240 1726773112.96171: getting the remaining hosts for this loop 8240 1726773112.96172: done getting the remaining hosts for this loop 8240 1726773112.96175: getting the next task for host managed_node2 8240 1726773112.96182: done getting next task for host managed_node2 8240 1726773112.96188: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8240 1726773112.96192: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773112.96211: getting variables 8240 1726773112.96213: in VariableManager get_vars() 8240 1726773112.96244: Calling all_inventory to load vars for managed_node2 8240 1726773112.96247: Calling groups_inventory to load vars for managed_node2 8240 1726773112.96249: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773112.96256: Calling all_plugins_play to load vars for managed_node2 8240 1726773112.96258: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773112.96259: Calling groups_plugins_play to load vars for managed_node2 8240 1726773112.96368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773112.96493: done with get_vars() 8240 1726773112.96503: done getting variables 8240 1726773112.96545: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:11:52 -0400 (0:00:00.019) 0:01:31.609 **** 8240 1726773112.96570: entering _queue_task() for managed_node2/fail 8240 1726773112.96730: worker is 1 (out of 1 available) 8240 1726773112.96745: exiting _queue_task() for managed_node2/fail 8240 1726773112.96757: done queuing things up, now waiting for results queue to drain 8240 1726773112.96759: waiting for pending results... 
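Both log-inspection tasks in this stretch are gated on the same test, so on a clean verify they are skipped outright, as the skip results above and below show. A rough sketch of that gating follows; only the when: condition and the module types (shell and fail) are taken from the log, everything else is a placeholder:

- name: Get last verify results from log
  # Placeholder command: the real task presumably inspects /var/log/tuned/tuned.log,
  # which the verify output points at, but its exact contents are not recorded here.
  shell: tail -n 50 /var/log/tuned/tuned.log
  register: __kernel_settings_last_verify   # hypothetical name
  when: __kernel_settings_register_verify_values is failed

- name: Report errors that are not bootloader errors
  fail:
    msg: "{{ __kernel_settings_last_verify.stdout | d('') }}"   # placeholder message
  when: __kernel_settings_register_verify_values is failed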
11535 1726773112.96887: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 11535 1726773112.97019: in run() - task 0affffe7-6841-885f-bbcf-000000000fc7 11535 1726773112.97037: variable 'ansible_search_path' from source: unknown 11535 1726773112.97042: variable 'ansible_search_path' from source: unknown 11535 1726773112.97068: calling self._execute() 11535 1726773112.97139: variable 'ansible_host' from source: host vars for 'managed_node2' 11535 1726773112.97147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11535 1726773112.97155: variable 'omit' from source: magic vars 11535 1726773112.97487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11535 1726773112.97721: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11535 1726773112.97756: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11535 1726773112.97781: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11535 1726773112.97808: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11535 1726773112.97886: variable '__kernel_settings_register_verify_values' from source: set_fact 11535 1726773112.97907: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 11535 1726773112.97910: when evaluation is False, skipping this task 11535 1726773112.97913: _execute() done 11535 1726773112.97915: dumping result to json 11535 1726773112.97920: done dumping result, returning 11535 1726773112.97925: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-885f-bbcf-000000000fc7] 11535 1726773112.97929: sending task result for task 0affffe7-6841-885f-bbcf-000000000fc7 11535 1726773112.97947: done sending task result for task 0affffe7-6841-885f-bbcf-000000000fc7 11535 1726773112.97949: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8240 1726773112.98111: no more pending results, returning what we have 8240 1726773112.98114: results queue empty 8240 1726773112.98115: checking for any_errors_fatal 8240 1726773112.98120: done checking for any_errors_fatal 8240 1726773112.98121: checking for max_fail_percentage 8240 1726773112.98122: done checking for max_fail_percentage 8240 1726773112.98123: checking to see if all hosts have failed and the running result is not ok 8240 1726773112.98124: done checking to see if all hosts have failed 8240 1726773112.98124: getting the remaining hosts for this loop 8240 1726773112.98125: done getting the remaining hosts for this loop 8240 1726773112.98128: getting the next task for host managed_node2 8240 1726773112.98134: done getting next task for host managed_node2 8240 1726773112.98136: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8240 1726773112.98139: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773112.98152: getting variables 8240 1726773112.98153: in VariableManager get_vars() 8240 1726773112.98177: Calling all_inventory to load vars for managed_node2 8240 1726773112.98179: Calling groups_inventory to load vars for managed_node2 8240 1726773112.98180: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773112.98190: Calling all_plugins_play to load vars for managed_node2 8240 1726773112.98192: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773112.98194: Calling groups_plugins_play to load vars for managed_node2 8240 1726773112.98346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773112.98466: done with get_vars() 8240 1726773112.98472: done getting variables 8240 1726773112.98518: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:11:52 -0400 (0:00:00.019) 0:01:31.629 **** 8240 1726773112.98540: entering _queue_task() for managed_node2/set_fact 8240 1726773112.98699: worker is 1 (out of 1 available) 8240 1726773112.98715: exiting _queue_task() for managed_node2/set_fact 8240 1726773112.98728: done queuing things up, now waiting for results queue to drain 8240 1726773112.98730: waiting for pending results... 
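The bookkeeping task queued here (main.yml:177) does no remote work; judging from the ansible_facts it returns a few entries further down, it is essentially equivalent to something like:

- name: Set the flag that reboot is needed to apply changes
  set_fact:
    kernel_settings_reboot_required: false

Presumably a change that cannot be applied live (such as a bootloader cmdline update) would set the flag to true instead; in this run everything was applied through tuned-adm, so it stays false.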
11536 1726773112.98855: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 11536 1726773112.98974: in run() - task 0affffe7-6841-885f-bbcf-000000000ee7 11536 1726773112.98991: variable 'ansible_search_path' from source: unknown 11536 1726773112.98995: variable 'ansible_search_path' from source: unknown 11536 1726773112.99022: calling self._execute() 11536 1726773112.99090: variable 'ansible_host' from source: host vars for 'managed_node2' 11536 1726773112.99099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11536 1726773112.99108: variable 'omit' from source: magic vars 11536 1726773112.99180: variable 'omit' from source: magic vars 11536 1726773112.99222: variable 'omit' from source: magic vars 11536 1726773112.99245: variable 'omit' from source: magic vars 11536 1726773112.99278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11536 1726773112.99310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11536 1726773112.99329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11536 1726773112.99345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11536 1726773112.99356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11536 1726773112.99380: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11536 1726773112.99387: variable 'ansible_host' from source: host vars for 'managed_node2' 11536 1726773112.99391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11536 1726773112.99462: Set connection var ansible_pipelining to False 11536 1726773112.99469: Set connection var ansible_timeout to 10 11536 1726773112.99477: Set connection var ansible_module_compression to ZIP_DEFLATED 11536 1726773112.99480: Set connection var ansible_shell_type to sh 11536 1726773112.99487: Set connection var ansible_shell_executable to /bin/sh 11536 1726773112.99492: Set connection var ansible_connection to ssh 11536 1726773112.99510: variable 'ansible_shell_executable' from source: unknown 11536 1726773112.99515: variable 'ansible_connection' from source: unknown 11536 1726773112.99518: variable 'ansible_module_compression' from source: unknown 11536 1726773112.99522: variable 'ansible_shell_type' from source: unknown 11536 1726773112.99525: variable 'ansible_shell_executable' from source: unknown 11536 1726773112.99529: variable 'ansible_host' from source: host vars for 'managed_node2' 11536 1726773112.99533: variable 'ansible_pipelining' from source: unknown 11536 1726773112.99536: variable 'ansible_timeout' from source: unknown 11536 1726773112.99540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11536 1726773112.99635: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11536 1726773112.99647: variable 'omit' from source: magic vars 11536 1726773112.99653: starting attempt loop 11536 1726773112.99657: running the handler 11536 1726773112.99667: handler 
run complete 11536 1726773112.99676: attempt loop complete, returning result 11536 1726773112.99679: _execute() done 11536 1726773112.99682: dumping result to json 11536 1726773112.99687: done dumping result, returning 11536 1726773112.99694: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-885f-bbcf-000000000ee7] 11536 1726773112.99699: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee7 11536 1726773112.99721: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee7 11536 1726773112.99725: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8240 1726773112.99875: no more pending results, returning what we have 8240 1726773112.99878: results queue empty 8240 1726773112.99879: checking for any_errors_fatal 8240 1726773112.99884: done checking for any_errors_fatal 8240 1726773112.99884: checking for max_fail_percentage 8240 1726773112.99887: done checking for max_fail_percentage 8240 1726773112.99888: checking to see if all hosts have failed and the running result is not ok 8240 1726773112.99888: done checking to see if all hosts have failed 8240 1726773112.99889: getting the remaining hosts for this loop 8240 1726773112.99890: done getting the remaining hosts for this loop 8240 1726773112.99892: getting the next task for host managed_node2 8240 1726773112.99897: done getting next task for host managed_node2 8240 1726773112.99899: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8240 1726773112.99904: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773112.99913: getting variables 8240 1726773112.99914: in VariableManager get_vars() 8240 1726773112.99938: Calling all_inventory to load vars for managed_node2 8240 1726773112.99940: Calling groups_inventory to load vars for managed_node2 8240 1726773112.99941: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773112.99948: Calling all_plugins_play to load vars for managed_node2 8240 1726773112.99950: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773112.99951: Calling groups_plugins_play to load vars for managed_node2 8240 1726773113.00056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773113.00175: done with get_vars() 8240 1726773113.00182: done getting variables 8240 1726773113.00226: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:11:53 -0400 (0:00:00.017) 0:01:31.646 **** 8240 1726773113.00249: entering _queue_task() for managed_node2/set_fact 8240 1726773113.00406: worker is 1 (out of 1 available) 8240 1726773113.00420: exiting _queue_task() for managed_node2/set_fact 8240 1726773113.00433: done queuing things up, now waiting for results queue to drain 8240 1726773113.00435: waiting for pending results... 11537 1726773113.00557: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 11537 1726773113.00669: in run() - task 0affffe7-6841-885f-bbcf-000000000ee8 11537 1726773113.00687: variable 'ansible_search_path' from source: unknown 11537 1726773113.00691: variable 'ansible_search_path' from source: unknown 11537 1726773113.00718: calling self._execute() 11537 1726773113.00787: variable 'ansible_host' from source: host vars for 'managed_node2' 11537 1726773113.00796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11537 1726773113.00805: variable 'omit' from source: magic vars 11537 1726773113.00886: variable 'omit' from source: magic vars 11537 1726773113.00927: variable 'omit' from source: magic vars 11537 1726773113.01195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11537 1726773113.01439: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11537 1726773113.01471: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11537 1726773113.01499: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11537 1726773113.01525: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11537 1726773113.01636: variable '__kernel_settings_register_profile' from source: set_fact 11537 1726773113.01649: variable '__kernel_settings_register_mode' from source: set_fact 11537 1726773113.01658: variable '__kernel_settings_register_apply' from source: set_fact 11537 1726773113.01694: variable 'omit' from source: magic vars 11537 
1726773113.01716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11537 1726773113.01738: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11537 1726773113.01754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11537 1726773113.01768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11537 1726773113.01778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11537 1726773113.01803: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11537 1726773113.01808: variable 'ansible_host' from source: host vars for 'managed_node2' 11537 1726773113.01813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11537 1726773113.01877: Set connection var ansible_pipelining to False 11537 1726773113.01886: Set connection var ansible_timeout to 10 11537 1726773113.01894: Set connection var ansible_module_compression to ZIP_DEFLATED 11537 1726773113.01897: Set connection var ansible_shell_type to sh 11537 1726773113.01903: Set connection var ansible_shell_executable to /bin/sh 11537 1726773113.01908: Set connection var ansible_connection to ssh 11537 1726773113.01923: variable 'ansible_shell_executable' from source: unknown 11537 1726773113.01928: variable 'ansible_connection' from source: unknown 11537 1726773113.01931: variable 'ansible_module_compression' from source: unknown 11537 1726773113.01935: variable 'ansible_shell_type' from source: unknown 11537 1726773113.01938: variable 'ansible_shell_executable' from source: unknown 11537 1726773113.01942: variable 'ansible_host' from source: host vars for 'managed_node2' 11537 1726773113.01946: variable 'ansible_pipelining' from source: unknown 11537 1726773113.01949: variable 'ansible_timeout' from source: unknown 11537 1726773113.01953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11537 1726773113.02022: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11537 1726773113.02033: variable 'omit' from source: magic vars 11537 1726773113.02039: starting attempt loop 11537 1726773113.02043: running the handler 11537 1726773113.02052: handler run complete 11537 1726773113.02060: attempt loop complete, returning result 11537 1726773113.02063: _execute() done 11537 1726773113.02066: dumping result to json 11537 1726773113.02069: done dumping result, returning 11537 1726773113.02075: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-885f-bbcf-000000000ee8] 11537 1726773113.02081: sending task result for task 0affffe7-6841-885f-bbcf-000000000ee8 11537 1726773113.02104: done sending task result for task 0affffe7-6841-885f-bbcf-000000000ee8 11537 1726773113.02108: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8240 1726773113.02238: no more pending results, returning what we have 8240 1726773113.02241: results queue empty 8240 1726773113.02242: checking 
for any_errors_fatal 8240 1726773113.02247: done checking for any_errors_fatal 8240 1726773113.02248: checking for max_fail_percentage 8240 1726773113.02249: done checking for max_fail_percentage 8240 1726773113.02250: checking to see if all hosts have failed and the running result is not ok 8240 1726773113.02251: done checking to see if all hosts have failed 8240 1726773113.02252: getting the remaining hosts for this loop 8240 1726773113.02253: done getting the remaining hosts for this loop 8240 1726773113.02256: getting the next task for host managed_node2 8240 1726773113.02265: done getting next task for host managed_node2 8240 1726773113.02266: ^ task is: TASK: meta (role_complete) 8240 1726773113.02269: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773113.02280: getting variables 8240 1726773113.02281: in VariableManager get_vars() 8240 1726773113.02318: Calling all_inventory to load vars for managed_node2 8240 1726773113.02321: Calling groups_inventory to load vars for managed_node2 8240 1726773113.02322: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773113.02330: Calling all_plugins_play to load vars for managed_node2 8240 1726773113.02331: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773113.02333: Calling groups_plugins_play to load vars for managed_node2 8240 1726773113.02444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773113.02604: done with get_vars() 8240 1726773113.02611: done getting variables 8240 1726773113.02663: done queuing things up, now waiting for results queue to drain 8240 1726773113.02669: results queue empty 8240 1726773113.02669: checking for any_errors_fatal 8240 1726773113.02672: done checking for any_errors_fatal 8240 1726773113.02673: checking for max_fail_percentage 8240 1726773113.02673: done checking for max_fail_percentage 8240 1726773113.02673: checking to see if all hosts have failed and the running result is not ok 8240 1726773113.02674: done checking to see if all hosts have failed 8240 1726773113.02674: getting the remaining hosts for this loop 8240 1726773113.02675: done getting the remaining hosts for this loop 8240 1726773113.02676: getting the next task for host managed_node2 8240 1726773113.02679: done getting next task for host managed_node2 8240 1726773113.02680: ^ task is: TASK: Verify no settings 8240 1726773113.02681: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773113.02683: getting variables 8240 1726773113.02683: in VariableManager get_vars() 8240 1726773113.02693: Calling all_inventory to load vars for managed_node2 8240 1726773113.02695: Calling groups_inventory to load vars for managed_node2 8240 1726773113.02696: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773113.02699: Calling all_plugins_play to load vars for managed_node2 8240 1726773113.02700: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773113.02704: Calling groups_plugins_play to load vars for managed_node2 8240 1726773113.02779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773113.02884: done with get_vars() 8240 1726773113.02892: done getting variables 8240 1726773113.02919: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify no settings] ****************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20 Thursday 19 September 2024 15:11:53 -0400 (0:00:00.026) 0:01:31.673 **** 8240 1726773113.02940: entering _queue_task() for managed_node2/shell 8240 1726773113.03104: worker is 1 (out of 1 available) 8240 1726773113.03119: exiting _queue_task() for managed_node2/shell 8240 1726773113.03132: done queuing things up, now waiting for results queue to drain 8240 1726773113.03134: waiting for pending results... 
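The two set_fact results reported just above set kernel_settings_reboot_required to false and __kernel_settings_changed to true. The role source itself is not part of this log, but based on the task names, the task path (roles/kernel_settings/tasks/main.yml:181), and the registered results evaluated in the trace (__kernel_settings_register_profile, __kernel_settings_register_mode, __kernel_settings_register_apply), the tasks plausibly look like the sketch below; the exact expression for the second fact is an assumption:

    # Illustrative sketch only -- the real tasks live in the role's tasks/main.yml,
    # whose contents are not included in this log.
    - name: Set the flag that reboot is needed to apply changes
      set_fact:
        kernel_settings_reboot_required: false

    - name: Set flag to indicate changed for testing
      set_fact:
        # Assumed expression: the trace above only shows that these three
        # registered results are consulted before the fact comes out as true.
        __kernel_settings_changed: "{{ __kernel_settings_register_profile is changed
          or __kernel_settings_register_mode is changed
          or __kernel_settings_register_apply is changed }}"
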
11538 1726773113.03260: running TaskExecutor() for managed_node2/TASK: Verify no settings 11538 1726773113.03368: in run() - task 0affffe7-6841-885f-bbcf-000000000cae 11538 1726773113.03383: variable 'ansible_search_path' from source: unknown 11538 1726773113.03389: variable 'ansible_search_path' from source: unknown 11538 1726773113.03417: calling self._execute() 11538 1726773113.03491: variable 'ansible_host' from source: host vars for 'managed_node2' 11538 1726773113.03499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11538 1726773113.03508: variable 'omit' from source: magic vars 11538 1726773113.03587: variable 'omit' from source: magic vars 11538 1726773113.03618: variable 'omit' from source: magic vars 11538 1726773113.03867: variable '__kernel_settings_profile_filename' from source: role '' exported vars 11538 1726773113.03927: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11538 1726773113.03993: variable '__kernel_settings_profile_parent' from source: set_fact 11538 1726773113.04002: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 11538 1726773113.04037: variable 'omit' from source: magic vars 11538 1726773113.04068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11538 1726773113.04097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11538 1726773113.04115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11538 1726773113.04129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11538 1726773113.04138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11538 1726773113.04159: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11538 1726773113.04162: variable 'ansible_host' from source: host vars for 'managed_node2' 11538 1726773113.04165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11538 1726773113.04242: Set connection var ansible_pipelining to False 11538 1726773113.04250: Set connection var ansible_timeout to 10 11538 1726773113.04258: Set connection var ansible_module_compression to ZIP_DEFLATED 11538 1726773113.04261: Set connection var ansible_shell_type to sh 11538 1726773113.04266: Set connection var ansible_shell_executable to /bin/sh 11538 1726773113.04271: Set connection var ansible_connection to ssh 11538 1726773113.04288: variable 'ansible_shell_executable' from source: unknown 11538 1726773113.04292: variable 'ansible_connection' from source: unknown 11538 1726773113.04296: variable 'ansible_module_compression' from source: unknown 11538 1726773113.04299: variable 'ansible_shell_type' from source: unknown 11538 1726773113.04305: variable 'ansible_shell_executable' from source: unknown 11538 1726773113.04308: variable 'ansible_host' from source: host vars for 'managed_node2' 11538 1726773113.04313: variable 'ansible_pipelining' from source: unknown 11538 1726773113.04316: variable 'ansible_timeout' from source: unknown 11538 1726773113.04321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11538 1726773113.04413: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11538 1726773113.04424: variable 'omit' from source: magic vars 11538 1726773113.04431: starting attempt loop 11538 1726773113.04434: running the handler 11538 1726773113.04443: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11538 1726773113.04460: _low_level_execute_command(): starting 11538 1726773113.04468: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11538 1726773113.06817: stdout chunk (state=2): >>>/root <<< 11538 1726773113.06942: stderr chunk (state=3): >>><<< 11538 1726773113.06949: stdout chunk (state=3): >>><<< 11538 1726773113.06968: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11538 1726773113.06981: _low_level_execute_command(): starting 11538 1726773113.06990: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548 `" && echo ansible-tmp-1726773113.0697615-11538-205930205760548="` echo /root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548 `" ) && sleep 0' 11538 1726773113.09584: stdout chunk (state=2): >>>ansible-tmp-1726773113.0697615-11538-205930205760548=/root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548 <<< 11538 1726773113.09715: stderr chunk (state=3): >>><<< 11538 1726773113.09722: stdout chunk (state=3): >>><<< 11538 1726773113.09736: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773113.0697615-11538-205930205760548=/root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548 , stderr= 11538 1726773113.09760: variable 'ansible_module_compression' from source: unknown 11538 1726773113.09808: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11538 1726773113.09839: variable 'ansible_facts' from source: unknown 11538 1726773113.09903: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548/AnsiballZ_command.py 11538 1726773113.10067: Sending initial data 11538 1726773113.10075: Sent initial data (155 bytes) 11538 1726773113.12602: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpjyddo0yi /root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548/AnsiballZ_command.py <<< 11538 1726773113.13700: stderr chunk (state=3): >>><<< 11538 1726773113.13709: stdout chunk (state=3): >>><<< 11538 1726773113.13728: done transferring module to remote 11538 1726773113.13739: _low_level_execute_command(): starting 11538 1726773113.13744: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548/ /root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548/AnsiballZ_command.py && sleep 0' 11538 1726773113.16099: stderr chunk (state=2): >>><<< 11538 1726773113.16109: stdout chunk (state=2): >>><<< 11538 1726773113.16123: _low_level_execute_command() done: rc=0, stdout=, stderr= 11538 1726773113.16127: _low_level_execute_command(): starting 11538 
1726773113.16132: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548/AnsiballZ_command.py && sleep 0' 11538 1726773113.31823: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 15:11:53.308692", "end": "2024-09-19 15:11:53.316160", "delta": "0:00:00.007468", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11538 1726773113.32951: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11538 1726773113.33001: stderr chunk (state=3): >>><<< 11538 1726773113.33009: stdout chunk (state=3): >>><<< 11538 1726773113.33025: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 15:11:53.308692", "end": "2024-09-19 15:11:53.316160", "delta": "0:00:00.007468", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
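The module invocation dumped above contains the full _raw_params string for the "Verify no settings" check, so the script can be read back directly once the JSON escaping is removed. Wrapped in the shell task it was presumably launched from (the task wrapper and changed_when are assumptions about tests/kernel_settings/tasks/cleanup.yml:20; the script body is taken verbatim from the invocation), it looks like this:

    # Script body recovered from the "_raw_params" value above; the surrounding
    # task wrapper is an assumption, not shown in this log.
    - name: Verify no settings
      shell: |
        set -euxo pipefail
        exec 1>&2
        rc=0
        conf=/etc/tuned/kernel_settings/tuned.conf
        for section in sysctl sysfs systemd vm; do
          if grep ^\\["$section"\\] "$conf"; then
            echo ERROR: "$section" settings present
            rc=1
          fi
        done
        exit "$rc"
      # Assumed: the final task result reports "changed": false even though the raw
      # module output says "changed": true, which is what changed_when: false does.
      changed_when: false

When none of the four section headers is present in tuned.conf, every grep in the if-condition fails quietly (conditions are exempt from set -e) and the script exits 0, matching the '+ exit 0' trace in the stderr above.
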
11538 1726773113.33055: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\["$section"\\\\] "$conf"; then\n echo ERROR: "$section" settings present\n rc=1\n fi\ndone\nexit "$rc"\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11538 1726773113.33065: _low_level_execute_command(): starting 11538 1726773113.33071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773113.0697615-11538-205930205760548/ > /dev/null 2>&1 && sleep 0' 11538 1726773113.35479: stderr chunk (state=2): >>><<< 11538 1726773113.35489: stdout chunk (state=2): >>><<< 11538 1726773113.35504: _low_level_execute_command() done: rc=0, stdout=, stderr= 11538 1726773113.35512: handler run complete 11538 1726773113.35530: Evaluated conditional (False): False 11538 1726773113.35539: attempt loop complete, returning result 11538 1726773113.35543: _execute() done 11538 1726773113.35547: dumping result to json 11538 1726773113.35553: done dumping result, returning 11538 1726773113.35559: done running TaskExecutor() for managed_node2/TASK: Verify no settings [0affffe7-6841-885f-bbcf-000000000cae] 11538 1726773113.35565: sending task result for task 0affffe7-6841-885f-bbcf-000000000cae 11538 1726773113.35599: done sending task result for task 0affffe7-6841-885f-bbcf-000000000cae 11538 1726773113.35603: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "delta": "0:00:00.007468", "end": "2024-09-19 15:11:53.316160", "rc": 0, "start": "2024-09-19 15:11:53.308692" } STDERR: + exec + rc=0 + conf=/etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysctl\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysfs\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[systemd\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[vm\]' /etc/tuned/kernel_settings/tuned.conf + exit 0 8240 1726773113.35758: no more pending results, returning what we have 8240 1726773113.35761: results queue empty 8240 1726773113.35762: checking for any_errors_fatal 8240 1726773113.35765: done checking for any_errors_fatal 8240 1726773113.35765: checking for max_fail_percentage 8240 1726773113.35767: done checking for max_fail_percentage 8240 1726773113.35768: checking to see if all hosts have failed and the running result is not ok 8240 1726773113.35769: done checking to see if all hosts have failed 8240 1726773113.35769: getting the remaining hosts for this 
loop 8240 1726773113.35770: done getting the remaining hosts for this loop 8240 1726773113.35774: getting the next task for host managed_node2 8240 1726773113.35781: done getting next task for host managed_node2 8240 1726773113.35783: ^ task is: TASK: Remove kernel_settings tuned profile 8240 1726773113.35787: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773113.35792: getting variables 8240 1726773113.35793: in VariableManager get_vars() 8240 1726773113.35831: Calling all_inventory to load vars for managed_node2 8240 1726773113.35834: Calling groups_inventory to load vars for managed_node2 8240 1726773113.35836: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773113.35847: Calling all_plugins_play to load vars for managed_node2 8240 1726773113.35854: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773113.35857: Calling groups_plugins_play to load vars for managed_node2 8240 1726773113.35973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773113.36134: done with get_vars() 8240 1726773113.36141: done getting variables TASK [Remove kernel_settings tuned profile] ************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36 Thursday 19 September 2024 15:11:53 -0400 (0:00:00.332) 0:01:32.005 **** 8240 1726773113.36211: entering _queue_task() for managed_node2/file 8240 1726773113.36370: worker is 1 (out of 1 available) 8240 1726773113.36389: exiting _queue_task() for managed_node2/file 8240 1726773113.36404: done queuing things up, now waiting for results queue to drain 8240 1726773113.36405: waiting for pending results... 
11549 1726773113.36535: running TaskExecutor() for managed_node2/TASK: Remove kernel_settings tuned profile 11549 1726773113.36645: in run() - task 0affffe7-6841-885f-bbcf-000000000caf 11549 1726773113.36661: variable 'ansible_search_path' from source: unknown 11549 1726773113.36665: variable 'ansible_search_path' from source: unknown 11549 1726773113.36695: calling self._execute() 11549 1726773113.36767: variable 'ansible_host' from source: host vars for 'managed_node2' 11549 1726773113.36776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11549 1726773113.36788: variable 'omit' from source: magic vars 11549 1726773113.36865: variable 'omit' from source: magic vars 11549 1726773113.36897: variable 'omit' from source: magic vars 11549 1726773113.36920: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11549 1726773113.37144: variable '__kernel_settings_profile_dir' from source: role '' exported vars 11549 1726773113.37222: variable '__kernel_settings_profile_parent' from source: set_fact 11549 1726773113.37232: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 11549 1726773113.37267: variable 'omit' from source: magic vars 11549 1726773113.37300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11549 1726773113.37328: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11549 1726773113.37347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11549 1726773113.37362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11549 1726773113.37373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11549 1726773113.37400: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11549 1726773113.37407: variable 'ansible_host' from source: host vars for 'managed_node2' 11549 1726773113.37412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11549 1726773113.37478: Set connection var ansible_pipelining to False 11549 1726773113.37488: Set connection var ansible_timeout to 10 11549 1726773113.37496: Set connection var ansible_module_compression to ZIP_DEFLATED 11549 1726773113.37500: Set connection var ansible_shell_type to sh 11549 1726773113.37506: Set connection var ansible_shell_executable to /bin/sh 11549 1726773113.37512: Set connection var ansible_connection to ssh 11549 1726773113.37527: variable 'ansible_shell_executable' from source: unknown 11549 1726773113.37531: variable 'ansible_connection' from source: unknown 11549 1726773113.37535: variable 'ansible_module_compression' from source: unknown 11549 1726773113.37538: variable 'ansible_shell_type' from source: unknown 11549 1726773113.37542: variable 'ansible_shell_executable' from source: unknown 11549 1726773113.37545: variable 'ansible_host' from source: host vars for 'managed_node2' 11549 1726773113.37549: variable 'ansible_pipelining' from source: unknown 11549 1726773113.37552: variable 'ansible_timeout' from source: unknown 11549 1726773113.37556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11549 1726773113.37700: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11549 1726773113.37715: variable 'omit' from source: magic vars 11549 1726773113.37722: starting attempt loop 11549 1726773113.37725: running the handler 11549 1726773113.37737: _low_level_execute_command(): starting 11549 1726773113.37745: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11549 1726773113.40048: stdout chunk (state=2): >>>/root <<< 11549 1726773113.40165: stderr chunk (state=3): >>><<< 11549 1726773113.40172: stdout chunk (state=3): >>><<< 11549 1726773113.40193: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11549 1726773113.40206: _low_level_execute_command(): starting 11549 1726773113.40213: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911 `" && echo ansible-tmp-1726773113.4020112-11549-31543487016911="` echo /root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911 `" ) && sleep 0' 11549 1726773113.42803: stdout chunk (state=2): >>>ansible-tmp-1726773113.4020112-11549-31543487016911=/root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911 <<< 11549 1726773113.42931: stderr chunk (state=3): >>><<< 11549 1726773113.42938: stdout chunk (state=3): >>><<< 11549 1726773113.42956: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773113.4020112-11549-31543487016911=/root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911 , stderr= 11549 1726773113.42995: variable 'ansible_module_compression' from source: unknown 11549 1726773113.43041: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 11549 1726773113.43077: variable 'ansible_facts' from source: unknown 11549 1726773113.43141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911/AnsiballZ_file.py 11549 1726773113.43245: Sending initial data 11549 1726773113.43252: Sent initial data (151 bytes) 11549 1726773113.45741: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpkqyo143o /root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911/AnsiballZ_file.py <<< 11549 1726773113.46866: stderr chunk (state=3): >>><<< 11549 1726773113.46875: stdout chunk (state=3): >>><<< 11549 1726773113.46897: done transferring module to remote 11549 1726773113.46910: _low_level_execute_command(): starting 11549 1726773113.46915: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911/ /root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911/AnsiballZ_file.py && sleep 0' 11549 1726773113.49252: stderr chunk (state=2): >>><<< 11549 1726773113.49261: stdout chunk (state=2): >>><<< 11549 1726773113.49276: _low_level_execute_command() done: rc=0, stdout=, stderr= 11549 1726773113.49280: _low_level_execute_command(): starting 11549 1726773113.49287: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911/AnsiballZ_file.py && sleep 0' 11549 1726773113.64704: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", 
"path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11549 1726773113.65766: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11549 1726773113.65816: stderr chunk (state=3): >>><<< 11549 1726773113.65823: stdout chunk (state=3): >>><<< 11549 1726773113.65840: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11549 1726773113.65873: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'absent', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11549 1726773113.65884: _low_level_execute_command(): starting 11549 1726773113.65891: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773113.4020112-11549-31543487016911/ > /dev/null 2>&1 && sleep 0' 11549 1726773113.68322: stderr chunk (state=2): >>><<< 11549 1726773113.68332: stdout chunk (state=2): >>><<< 11549 1726773113.68347: _low_level_execute_command() done: rc=0, stdout=, stderr= 11549 1726773113.68354: handler run complete 11549 1726773113.68375: attempt loop complete, returning result 11549 1726773113.68379: _execute() done 11549 1726773113.68382: dumping result to json 11549 1726773113.68389: done dumping result, returning 11549 1726773113.68396: done running TaskExecutor() for managed_node2/TASK: Remove kernel_settings tuned profile [0affffe7-6841-885f-bbcf-000000000caf] 11549 1726773113.68404: sending task result for task 0affffe7-6841-885f-bbcf-000000000caf 11549 1726773113.68438: done sending task result for task 0affffe7-6841-885f-bbcf-000000000caf 11549 1726773113.68442: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "path": "/etc/tuned/kernel_settings", "state": "absent" } 8240 1726773113.68591: no more pending results, returning what we have 8240 1726773113.68595: results queue empty 8240 1726773113.68596: checking for any_errors_fatal 8240 1726773113.68607: done checking for any_errors_fatal 8240 1726773113.68607: checking for max_fail_percentage 8240 1726773113.68609: done checking for max_fail_percentage 8240 1726773113.68610: checking to see if all hosts have failed and the running result is not ok 8240 1726773113.68610: done checking to see if all hosts have failed 8240 1726773113.68611: getting the remaining hosts for this loop 8240 1726773113.68612: done getting the remaining hosts for this loop 8240 1726773113.68616: getting the next task for host managed_node2 8240 1726773113.68622: done getting next task for host managed_node2 8240 1726773113.68624: ^ task is: TASK: Get active_profile 8240 1726773113.68626: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773113.68630: getting variables 8240 1726773113.68632: in VariableManager get_vars() 8240 1726773113.68666: Calling all_inventory to load vars for managed_node2 8240 1726773113.68668: Calling groups_inventory to load vars for managed_node2 8240 1726773113.68670: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773113.68681: Calling all_plugins_play to load vars for managed_node2 8240 1726773113.68684: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773113.68688: Calling groups_plugins_play to load vars for managed_node2 8240 1726773113.68808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773113.68924: done with get_vars() 8240 1726773113.68933: done getting variables TASK [Get active_profile] ****************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41 Thursday 19 September 2024 15:11:53 -0400 (0:00:00.327) 0:01:32.333 **** 8240 1726773113.69002: entering _queue_task() for managed_node2/slurp 8240 1726773113.69168: worker is 1 (out of 1 available) 8240 1726773113.69182: exiting _queue_task() for managed_node2/slurp 8240 1726773113.69197: done queuing things up, now waiting for results queue to drain 8240 1726773113.69199: waiting for pending results... 11557 1726773113.69333: running TaskExecutor() for managed_node2/TASK: Get active_profile 11557 1726773113.69445: in run() - task 0affffe7-6841-885f-bbcf-000000000cb0 11557 1726773113.69463: variable 'ansible_search_path' from source: unknown 11557 1726773113.69467: variable 'ansible_search_path' from source: unknown 11557 1726773113.69497: calling self._execute() 11557 1726773113.69572: variable 'ansible_host' from source: host vars for 'managed_node2' 11557 1726773113.69581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11557 1726773113.69592: variable 'omit' from source: magic vars 11557 1726773113.69672: variable 'omit' from source: magic vars 11557 1726773113.69708: variable 'omit' from source: magic vars 11557 1726773113.69729: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11557 1726773113.69957: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11557 1726773113.70021: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11557 1726773113.70049: variable 'omit' from source: magic vars 11557 1726773113.70081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11557 1726773113.70114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11557 1726773113.70133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11557 1726773113.70218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11557 1726773113.70230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11557 1726773113.70254: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11557 1726773113.70259: variable 'ansible_host' from source: host vars for 'managed_node2' 11557 1726773113.70263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11557 1726773113.70335: Set 
connection var ansible_pipelining to False 11557 1726773113.70343: Set connection var ansible_timeout to 10 11557 1726773113.70351: Set connection var ansible_module_compression to ZIP_DEFLATED 11557 1726773113.70354: Set connection var ansible_shell_type to sh 11557 1726773113.70359: Set connection var ansible_shell_executable to /bin/sh 11557 1726773113.70364: Set connection var ansible_connection to ssh 11557 1726773113.70381: variable 'ansible_shell_executable' from source: unknown 11557 1726773113.70387: variable 'ansible_connection' from source: unknown 11557 1726773113.70391: variable 'ansible_module_compression' from source: unknown 11557 1726773113.70394: variable 'ansible_shell_type' from source: unknown 11557 1726773113.70397: variable 'ansible_shell_executable' from source: unknown 11557 1726773113.70403: variable 'ansible_host' from source: host vars for 'managed_node2' 11557 1726773113.70407: variable 'ansible_pipelining' from source: unknown 11557 1726773113.70408: variable 'ansible_timeout' from source: unknown 11557 1726773113.70411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11557 1726773113.70547: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11557 1726773113.70556: variable 'omit' from source: magic vars 11557 1726773113.70560: starting attempt loop 11557 1726773113.70562: running the handler 11557 1726773113.70572: _low_level_execute_command(): starting 11557 1726773113.70578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11557 1726773113.72908: stdout chunk (state=2): >>>/root <<< 11557 1726773113.73033: stderr chunk (state=3): >>><<< 11557 1726773113.73040: stdout chunk (state=3): >>><<< 11557 1726773113.73058: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11557 1726773113.73071: _low_level_execute_command(): starting 11557 1726773113.73077: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374 `" && echo ansible-tmp-1726773113.7306597-11557-121697381081374="` echo /root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374 `" ) && sleep 0' 11557 1726773113.75615: stdout chunk (state=2): >>>ansible-tmp-1726773113.7306597-11557-121697381081374=/root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374 <<< 11557 1726773113.75745: stderr chunk (state=3): >>><<< 11557 1726773113.75752: stdout chunk (state=3): >>><<< 11557 1726773113.75768: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773113.7306597-11557-121697381081374=/root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374 , stderr= 11557 1726773113.75808: variable 'ansible_module_compression' from source: unknown 11557 1726773113.75840: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 11557 1726773113.75871: variable 'ansible_facts' from source: unknown 11557 1726773113.75932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374/AnsiballZ_slurp.py 11557 1726773113.76032: Sending initial data 11557 1726773113.76040: Sent initial data (153 bytes) 11557 1726773113.78541: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-8240kvoq26km/tmpqpziqf3q /root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374/AnsiballZ_slurp.py <<< 11557 1726773113.79619: stderr chunk (state=3): >>><<< 11557 1726773113.79628: stdout chunk (state=3): >>><<< 11557 1726773113.79647: done transferring module to remote 11557 1726773113.79658: _low_level_execute_command(): starting 11557 1726773113.79663: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374/ /root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374/AnsiballZ_slurp.py && sleep 0' 11557 1726773113.81987: stderr chunk (state=2): >>><<< 11557 1726773113.81996: stdout chunk (state=2): >>><<< 11557 1726773113.82011: _low_level_execute_command() done: rc=0, stdout=, stderr= 11557 1726773113.82015: _low_level_execute_command(): starting 11557 1726773113.82021: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374/AnsiballZ_slurp.py && sleep 0' 11557 1726773113.96812: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 11557 1726773113.97801: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11557 1726773113.97850: stderr chunk (state=3): >>><<< 11557 1726773113.97857: stdout chunk (state=3): >>><<< 11557 1726773113.97873: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.9.64 closed. 
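The slurp payload above is base64; dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK decodes to "virtual-guest kernel_settings", i.e. the kernel_settings profile is still listed in /etc/tuned/active_profile, which is exactly what the following task ("Ensure kernel_settings is not in active_profile") cleans up. A minimal sketch of the read step follows; the real task at cleanup.yml:41 templates the path from role variables (__kernel_settings_tuned_active_profile), and the register name here is hypothetical since the log does not show it:

    # Sketch only: the path and the register name are stand-ins for the templated
    # values used by the actual test task.
    - name: Get active_profile
      slurp:
        path: /etc/tuned/active_profile
      register: __active_profile_content

The decoded text would then typically be recovered with the b64decode filter, e.g. "{{ __active_profile_content.content | b64decode }}".
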
11557 1726773113.97898: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11557 1726773113.97910: _low_level_execute_command(): starting 11557 1726773113.97916: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773113.7306597-11557-121697381081374/ > /dev/null 2>&1 && sleep 0' 11557 1726773114.00330: stderr chunk (state=2): >>><<< 11557 1726773114.00340: stdout chunk (state=2): >>><<< 11557 1726773114.00356: _low_level_execute_command() done: rc=0, stdout=, stderr= 11557 1726773114.00363: handler run complete 11557 1726773114.00377: attempt loop complete, returning result 11557 1726773114.00381: _execute() done 11557 1726773114.00384: dumping result to json 11557 1726773114.00390: done dumping result, returning 11557 1726773114.00397: done running TaskExecutor() for managed_node2/TASK: Get active_profile [0affffe7-6841-885f-bbcf-000000000cb0] 11557 1726773114.00403: sending task result for task 0affffe7-6841-885f-bbcf-000000000cb0 11557 1726773114.00434: done sending task result for task 0affffe7-6841-885f-bbcf-000000000cb0 11557 1726773114.00438: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8240 1726773114.00568: no more pending results, returning what we have 8240 1726773114.00572: results queue empty 8240 1726773114.00573: checking for any_errors_fatal 8240 1726773114.00583: done checking for any_errors_fatal 8240 1726773114.00584: checking for max_fail_percentage 8240 1726773114.00587: done checking for max_fail_percentage 8240 1726773114.00588: checking to see if all hosts have failed and the running result is not ok 8240 1726773114.00589: done checking to see if all hosts have failed 8240 1726773114.00589: getting the remaining hosts for this loop 8240 1726773114.00590: done getting the remaining hosts for this loop 8240 1726773114.00594: getting the next task for host managed_node2 8240 1726773114.00603: done getting next task for host managed_node2 8240 1726773114.00606: ^ task is: TASK: Ensure kernel_settings is not in active_profile 8240 1726773114.00608: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 8240 1726773114.00612: getting variables 8240 1726773114.00614: in VariableManager get_vars() 8240 1726773114.00648: Calling all_inventory to load vars for managed_node2 8240 1726773114.00651: Calling groups_inventory to load vars for managed_node2 8240 1726773114.00653: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773114.00664: Calling all_plugins_play to load vars for managed_node2 8240 1726773114.00666: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773114.00669: Calling groups_plugins_play to load vars for managed_node2 8240 1726773114.00792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773114.00954: done with get_vars() 8240 1726773114.00962: done getting variables 8240 1726773114.01010: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings is not in active_profile] ************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46 Thursday 19 September 2024 15:11:54 -0400 (0:00:00.320) 0:01:32.654 **** 8240 1726773114.01032: entering _queue_task() for managed_node2/copy 8240 1726773114.01207: worker is 1 (out of 1 available) 8240 1726773114.01222: exiting _queue_task() for managed_node2/copy 8240 1726773114.01234: done queuing things up, now waiting for results queue to drain 8240 1726773114.01237: waiting for pending results... 11565 1726773114.01359: running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings is not in active_profile 11565 1726773114.01471: in run() - task 0affffe7-6841-885f-bbcf-000000000cb1 11565 1726773114.01489: variable 'ansible_search_path' from source: unknown 11565 1726773114.01493: variable 'ansible_search_path' from source: unknown 11565 1726773114.01523: calling self._execute() 11565 1726773114.01598: variable 'ansible_host' from source: host vars for 'managed_node2' 11565 1726773114.01608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11565 1726773114.01617: variable 'omit' from source: magic vars 11565 1726773114.01694: variable 'omit' from source: magic vars 11565 1726773114.01728: variable 'omit' from source: magic vars 11565 1726773114.01750: variable '__active_profile' from source: task vars 11565 1726773114.01974: variable '__active_profile' from source: task vars 11565 1726773114.02118: variable '__cur_profile' from source: task vars 11565 1726773114.02223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11565 1726773114.03715: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11565 1726773114.04038: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11565 1726773114.04067: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11565 1726773114.04097: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11565 1726773114.04119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11565 
1726773114.04174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11565 1726773114.04197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11565 1726773114.04218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11565 1726773114.04245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11565 1726773114.04257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11565 1726773114.04333: variable '__kernel_settings_tuned_current_profile' from source: set_fact 11565 1726773114.04373: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11565 1726773114.04430: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 11565 1726773114.04481: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11565 1726773114.04506: variable 'omit' from source: magic vars 11565 1726773114.04527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11565 1726773114.04547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11565 1726773114.04562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11565 1726773114.04576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11565 1726773114.04587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11565 1726773114.04613: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11565 1726773114.04619: variable 'ansible_host' from source: host vars for 'managed_node2' 11565 1726773114.04623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11565 1726773114.04687: Set connection var ansible_pipelining to False 11565 1726773114.04695: Set connection var ansible_timeout to 10 11565 1726773114.04705: Set connection var ansible_module_compression to ZIP_DEFLATED 11565 1726773114.04709: Set connection var ansible_shell_type to sh 11565 1726773114.04715: Set connection var ansible_shell_executable to /bin/sh 11565 1726773114.04720: Set connection var ansible_connection to ssh 11565 1726773114.04735: variable 'ansible_shell_executable' from source: unknown 11565 1726773114.04739: variable 'ansible_connection' from source: unknown 11565 1726773114.04742: variable 'ansible_module_compression' from source: unknown 11565 1726773114.04746: variable 'ansible_shell_type' from source: unknown 11565 1726773114.04749: variable 'ansible_shell_executable' from source: unknown 11565 1726773114.04752: variable 'ansible_host' from source: host vars for 'managed_node2' 11565 1726773114.04756: variable 'ansible_pipelining' from 
source: unknown 11565 1726773114.04760: variable 'ansible_timeout' from source: unknown 11565 1726773114.04764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11565 1726773114.04831: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11565 1726773114.04841: variable 'omit' from source: magic vars 11565 1726773114.04847: starting attempt loop 11565 1726773114.04851: running the handler 11565 1726773114.04861: _low_level_execute_command(): starting 11565 1726773114.04868: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11565 1726773114.07157: stdout chunk (state=2): >>>/root <<< 11565 1726773114.07282: stderr chunk (state=3): >>><<< 11565 1726773114.07289: stdout chunk (state=3): >>><<< 11565 1726773114.07311: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11565 1726773114.07323: _low_level_execute_command(): starting 11565 1726773114.07329: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979 `" && echo ansible-tmp-1726773114.073193-11565-147362582385979="` echo /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979 `" ) && sleep 0' 11565 1726773114.09876: stdout chunk (state=2): >>>ansible-tmp-1726773114.073193-11565-147362582385979=/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979 <<< 11565 1726773114.10003: stderr chunk (state=3): >>><<< 11565 1726773114.10010: stdout chunk (state=3): >>><<< 11565 1726773114.10024: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773114.073193-11565-147362582385979=/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979 , stderr= 11565 1726773114.10091: variable 'ansible_module_compression' from source: unknown 11565 1726773114.10134: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11565 1726773114.10165: variable 'ansible_facts' from source: unknown 11565 1726773114.10224: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/AnsiballZ_stat.py 11565 1726773114.10309: Sending initial data 11565 1726773114.10316: Sent initial data (151 bytes) 11565 1726773114.12780: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpcx8yx_zm /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/AnsiballZ_stat.py <<< 11565 1726773114.13879: stderr chunk (state=3): >>><<< 11565 1726773114.13890: stdout chunk (state=3): >>><<< 11565 1726773114.13911: done transferring module to remote 11565 1726773114.13922: _low_level_execute_command(): starting 11565 1726773114.13927: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/ /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/AnsiballZ_stat.py && sleep 0' 11565 1726773114.16276: stderr chunk (state=2): >>><<< 11565 1726773114.16289: stdout chunk (state=2): >>><<< 11565 1726773114.16306: _low_level_execute_command() done: rc=0, stdout=, stderr= 11565 1726773114.16311: _low_level_execute_command(): starting 11565 1726773114.16316: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/AnsiballZ_stat.py && sleep 0' 11565 1726773114.32659: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773113.9652689, "mtime": 1726773111.4312437, "ctime": 1726773111.4312437, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11565 1726773114.33546: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11565 1726773114.33595: stderr chunk (state=3): >>><<< 11565 1726773114.33602: stdout chunk (state=3): >>><<< 11565 1726773114.33620: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 517996678, "dev": 51713, "nlink": 1, "atime": 1726773113.9652689, "mtime": 1726773111.4312437, "ctime": 1726773111.4312437, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3155426170", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
11565 1726773114.33657: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11565 1726773114.33744: Sending initial data 11565 1726773114.33752: Sent initial data (140 bytes) 11565 1726773114.36321: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpowzx0wmx /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/source <<< 11565 1726773114.36697: stderr chunk (state=3): >>><<< 11565 1726773114.36707: stdout chunk (state=3): >>><<< 11565 1726773114.36730: _low_level_execute_command(): starting 11565 1726773114.36737: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/ /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/source && sleep 0' 11565 1726773114.39089: stderr chunk (state=2): >>><<< 11565 1726773114.39100: stdout chunk (state=2): >>><<< 11565 1726773114.39118: _low_level_execute_command() done: rc=0, stdout=, stderr= 11565 1726773114.39139: variable 'ansible_module_compression' from source: unknown 11565 1726773114.39175: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11565 1726773114.39196: variable 'ansible_facts' from source: unknown 11565 1726773114.39252: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/AnsiballZ_copy.py 11565 1726773114.39343: Sending initial data 11565 1726773114.39350: Sent initial data (151 bytes) 11565 1726773114.41834: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpu6bf_9i2 /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/AnsiballZ_copy.py <<< 11565 1726773114.42969: stderr chunk (state=3): >>><<< 11565 1726773114.42980: stdout chunk (state=3): >>><<< 11565 1726773114.43002: done transferring module to remote 11565 1726773114.43012: _low_level_execute_command(): starting 11565 1726773114.43017: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/ /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/AnsiballZ_copy.py && sleep 0' 11565 1726773114.45369: stderr chunk (state=2): >>><<< 11565 1726773114.45379: stdout chunk (state=2): >>><<< 11565 1726773114.45397: _low_level_execute_command() done: rc=0, stdout=, stderr= 11565 1726773114.45402: _low_level_execute_command(): starting 11565 1726773114.45408: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/AnsiballZ_copy.py && sleep 0' 11565 1726773114.61892: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/source", "_original_basename": "tmpowzx0wmx", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11565 1726773114.63086: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11565 1726773114.63133: stderr chunk (state=3): >>><<< 11565 1726773114.63140: stdout chunk (state=3): >>><<< 11565 1726773114.63155: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/source", "_original_basename": "tmpowzx0wmx", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11565 1726773114.63180: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/source', '_original_basename': 'tmpowzx0wmx', 'follow': False, 'checksum': '633f07e1b5698d04352d5dca735869bf2fe77897', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11565 1726773114.63192: _low_level_execute_command(): starting 11565 1726773114.63197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/ > /dev/null 2>&1 && sleep 0' 11565 1726773114.65609: stderr chunk (state=2): >>><<< 11565 1726773114.65619: stdout chunk (state=2): >>><<< 11565 1726773114.65633: _low_level_execute_command() done: rc=0, stdout=, stderr= 11565 1726773114.65641: handler run complete 11565 1726773114.65660: attempt loop complete, returning result 11565 1726773114.65663: _execute() done 11565 1726773114.65666: dumping result to json 11565 1726773114.65673: done dumping result, returning 11565 1726773114.65680: done running TaskExecutor() for managed_node2/TASK: Ensure kernel_settings is not in active_profile [0affffe7-6841-885f-bbcf-000000000cb1] 11565 1726773114.65688: sending task result for task 0affffe7-6841-885f-bbcf-000000000cb1 11565 1726773114.65720: done sending task result for task 0affffe7-6841-885f-bbcf-000000000cb1 11565 1726773114.65724: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "src": "/root/.ansible/tmp/ansible-tmp-1726773114.073193-11565-147362582385979/source", "state": "file", "uid": 0 } 8240 1726773114.65875: no more pending results, returning what we have 8240 1726773114.65878: results queue empty 8240 1726773114.65879: checking for any_errors_fatal 8240 1726773114.65890: done checking for any_errors_fatal 8240 1726773114.65890: checking for max_fail_percentage 8240 1726773114.65892: done checking for max_fail_percentage 8240 1726773114.65893: checking to see if all hosts have failed and the running result is not ok 8240 1726773114.65894: done checking to see if all hosts have failed 8240 1726773114.65894: getting the remaining hosts for this loop 8240 1726773114.65896: done getting the remaining hosts for this loop 8240 1726773114.65899: getting the next task for host managed_node2 8240 1726773114.65906: done getting next task for host managed_node2 8240 1726773114.65909: ^ task is: TASK: Set profile_mode to auto 8240 1726773114.65911: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773114.65915: getting variables 8240 1726773114.65916: in VariableManager get_vars() 8240 1726773114.65951: Calling all_inventory to load vars for managed_node2 8240 1726773114.65954: Calling groups_inventory to load vars for managed_node2 8240 1726773114.65956: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773114.65966: Calling all_plugins_play to load vars for managed_node2 8240 1726773114.65969: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773114.65971: Calling groups_plugins_play to load vars for managed_node2 8240 1726773114.66099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773114.66213: done with get_vars() 8240 1726773114.66222: done getting variables 8240 1726773114.66265: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set profile_mode to auto] ************************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57 Thursday 19 September 2024 15:11:54 -0400 (0:00:00.652) 0:01:33.306 **** 8240 1726773114.66290: entering _queue_task() for managed_node2/copy 8240 1726773114.66458: worker is 1 (out of 1 available) 8240 1726773114.66473: exiting _queue_task() for managed_node2/copy 8240 1726773114.66488: done queuing things up, now waiting for results queue to drain 8240 1726773114.66490: waiting for pending results... 
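The two tasks traced above, "Get active_profile" and "Ensure kernel_settings is not in active_profile", come from the test cleanup file (tasks/cleanup.yml). The slurp payload "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK" decodes to "virtual-guest kernel_settings" plus a trailing newline (30 bytes, matching the stat size logged above), and the copy that follows wrote a 14-byte file, i.e. "virtual-guest" plus a newline with the kernel_settings token removed. The cleanup tasks themselves are not reproduced in this log, so the following is only a minimal sketch that rebuilds the content inline from the slurped value; the real task file may assemble it differently (the log shows helper vars such as __cur_profile being used):

  - name: Get active_profile
    slurp:
      path: /etc/tuned/active_profile
    register: __active_profile

  - name: Ensure kernel_settings is not in active_profile
    copy:
      # Sketch only: decode the slurped profile line and drop the
      # " kernel_settings" token before writing the file back, which
      # reproduces the 14-byte result seen in the log.
      content: "{{ __active_profile.content | b64decode | replace(' kernel_settings', '') }}"
      dest: /etc/tuned/active_profile
      mode: "0600"
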
11583 1726773114.66623: running TaskExecutor() for managed_node2/TASK: Set profile_mode to auto 11583 1726773114.66740: in run() - task 0affffe7-6841-885f-bbcf-000000000cb2 11583 1726773114.66757: variable 'ansible_search_path' from source: unknown 11583 1726773114.66762: variable 'ansible_search_path' from source: unknown 11583 1726773114.66791: calling self._execute() 11583 1726773114.66865: variable 'ansible_host' from source: host vars for 'managed_node2' 11583 1726773114.66874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11583 1726773114.66883: variable 'omit' from source: magic vars 11583 1726773114.66965: variable 'omit' from source: magic vars 11583 1726773114.66998: variable 'omit' from source: magic vars 11583 1726773114.67022: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 11583 1726773114.67249: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 11583 1726773114.67313: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 11583 1726773114.67417: variable 'omit' from source: magic vars 11583 1726773114.67450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11583 1726773114.67478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11583 1726773114.67499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11583 1726773114.67517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11583 1726773114.67528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11583 1726773114.67552: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11583 1726773114.67557: variable 'ansible_host' from source: host vars for 'managed_node2' 11583 1726773114.67562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11583 1726773114.67636: Set connection var ansible_pipelining to False 11583 1726773114.67644: Set connection var ansible_timeout to 10 11583 1726773114.67652: Set connection var ansible_module_compression to ZIP_DEFLATED 11583 1726773114.67655: Set connection var ansible_shell_type to sh 11583 1726773114.67660: Set connection var ansible_shell_executable to /bin/sh 11583 1726773114.67666: Set connection var ansible_connection to ssh 11583 1726773114.67682: variable 'ansible_shell_executable' from source: unknown 11583 1726773114.67688: variable 'ansible_connection' from source: unknown 11583 1726773114.67691: variable 'ansible_module_compression' from source: unknown 11583 1726773114.67693: variable 'ansible_shell_type' from source: unknown 11583 1726773114.67695: variable 'ansible_shell_executable' from source: unknown 11583 1726773114.67697: variable 'ansible_host' from source: host vars for 'managed_node2' 11583 1726773114.67699: variable 'ansible_pipelining' from source: unknown 11583 1726773114.67701: variable 'ansible_timeout' from source: unknown 11583 1726773114.67705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11583 1726773114.67791: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11583 1726773114.67807: variable 'omit' from source: magic vars 11583 1726773114.67813: starting attempt loop 11583 1726773114.67816: running the handler 11583 1726773114.67825: _low_level_execute_command(): starting 11583 1726773114.67830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11583 1726773114.70154: stdout chunk (state=2): >>>/root <<< 11583 1726773114.70276: stderr chunk (state=3): >>><<< 11583 1726773114.70285: stdout chunk (state=3): >>><<< 11583 1726773114.70307: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11583 1726773114.70321: _low_level_execute_command(): starting 11583 1726773114.70328: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531 `" && echo ansible-tmp-1726773114.703164-11583-262098250997531="` echo /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531 `" ) && sleep 0' 11583 1726773114.72922: stdout chunk (state=2): >>>ansible-tmp-1726773114.703164-11583-262098250997531=/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531 <<< 11583 1726773114.73055: stderr chunk (state=3): >>><<< 11583 1726773114.73065: stdout chunk (state=3): >>><<< 11583 1726773114.73082: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773114.703164-11583-262098250997531=/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531 , stderr= 11583 1726773114.73156: variable 'ansible_module_compression' from source: unknown 11583 1726773114.73207: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11583 1726773114.73241: variable 'ansible_facts' from source: unknown 11583 1726773114.73307: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/AnsiballZ_stat.py 11583 1726773114.73397: Sending initial data 11583 1726773114.73408: Sent initial data (151 bytes) 11583 1726773114.75892: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp2xr0yzb_ /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/AnsiballZ_stat.py <<< 11583 1726773114.76992: stderr chunk (state=3): >>><<< 11583 1726773114.77003: stdout chunk (state=3): >>><<< 11583 1726773114.77024: done transferring module to remote 11583 1726773114.77035: _low_level_execute_command(): starting 11583 1726773114.77040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/ /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/AnsiballZ_stat.py && sleep 0' 11583 1726773114.79446: stderr chunk (state=2): >>><<< 11583 1726773114.79456: stdout chunk (state=2): >>><<< 11583 1726773114.79471: _low_level_execute_command() done: rc=0, stdout=, stderr= 11583 1726773114.79475: _low_level_execute_command(): starting 11583 1726773114.79480: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/AnsiballZ_stat.py && sleep 0' 11583 1726773114.95894: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": 
false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773109.6472259, "mtime": 1726773111.4312437, "ctime": 1726773111.4312437, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 11583 1726773114.97059: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11583 1726773114.97113: stderr chunk (state=3): >>><<< 11583 1726773114.97120: stdout chunk (state=3): >>><<< 11583 1726773114.97136: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 524288194, "dev": 51713, "nlink": 1, "atime": 1726773109.6472259, "mtime": 1726773111.4312437, "ctime": 1726773111.4312437, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "2839214214", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.9.64 closed. 
11583 1726773114.97177: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11583 1726773114.97264: Sending initial data 11583 1726773114.97271: Sent initial data (140 bytes) 11583 1726773114.99855: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp5t9fm0f2 /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/source <<< 11583 1726773115.00250: stderr chunk (state=3): >>><<< 11583 1726773115.00260: stdout chunk (state=3): >>><<< 11583 1726773115.00281: _low_level_execute_command(): starting 11583 1726773115.00289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/ /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/source && sleep 0' 11583 1726773115.02679: stderr chunk (state=2): >>><<< 11583 1726773115.02691: stdout chunk (state=2): >>><<< 11583 1726773115.02709: _low_level_execute_command() done: rc=0, stdout=, stderr= 11583 1726773115.02730: variable 'ansible_module_compression' from source: unknown 11583 1726773115.02767: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 11583 1726773115.02790: variable 'ansible_facts' from source: unknown 11583 1726773115.02846: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/AnsiballZ_copy.py 11583 1726773115.02951: Sending initial data 11583 1726773115.02958: Sent initial data (151 bytes) 11583 1726773115.05466: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmpgtavddff /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/AnsiballZ_copy.py <<< 11583 1726773115.06614: stderr chunk (state=3): >>><<< 11583 1726773115.06625: stdout chunk (state=3): >>><<< 11583 1726773115.06645: done transferring module to remote 11583 1726773115.06655: _low_level_execute_command(): starting 11583 1726773115.06660: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/ /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/AnsiballZ_copy.py && sleep 0' 11583 1726773115.09026: stderr chunk (state=2): >>><<< 11583 1726773115.09038: stdout chunk (state=2): >>><<< 11583 1726773115.09055: _low_level_execute_command() done: rc=0, stdout=, stderr= 11583 1726773115.09059: _low_level_execute_command(): starting 11583 1726773115.09065: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/AnsiballZ_copy.py && sleep 0' 11583 1726773115.25501: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/source", "_original_basename": "tmp5t9fm0f2", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 11583 1726773115.26662: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11583 1726773115.26712: stderr chunk (state=3): >>><<< 11583 1726773115.26719: stdout chunk (state=3): >>><<< 11583 1726773115.26736: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/source", "_original_basename": "tmp5t9fm0f2", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.9.64 closed. 
11583 1726773115.26761: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/source', '_original_basename': 'tmp5t9fm0f2', 'follow': False, 'checksum': '43683f4e92c48be4b00ddd86e011a4f27fcdbeb5', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11583 1726773115.26771: _low_level_execute_command(): starting 11583 1726773115.26778: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/ > /dev/null 2>&1 && sleep 0' 11583 1726773115.29195: stderr chunk (state=2): >>><<< 11583 1726773115.29207: stdout chunk (state=2): >>><<< 11583 1726773115.29222: _low_level_execute_command() done: rc=0, stdout=, stderr= 11583 1726773115.29231: handler run complete 11583 1726773115.29249: attempt loop complete, returning result 11583 1726773115.29252: _execute() done 11583 1726773115.29255: dumping result to json 11583 1726773115.29261: done dumping result, returning 11583 1726773115.29268: done running TaskExecutor() for managed_node2/TASK: Set profile_mode to auto [0affffe7-6841-885f-bbcf-000000000cb2] 11583 1726773115.29273: sending task result for task 0affffe7-6841-885f-bbcf-000000000cb2 11583 1726773115.29310: done sending task result for task 0affffe7-6841-885f-bbcf-000000000cb2 11583 1726773115.29314: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "src": "/root/.ansible/tmp/ansible-tmp-1726773114.703164-11583-262098250997531/source", "state": "file", "uid": 0 } 8240 1726773115.29459: no more pending results, returning what we have 8240 1726773115.29463: results queue empty 8240 1726773115.29465: checking for any_errors_fatal 8240 1726773115.29474: done checking for any_errors_fatal 8240 1726773115.29474: checking for max_fail_percentage 8240 1726773115.29476: done checking for max_fail_percentage 8240 1726773115.29477: checking to see if all hosts have failed and the running result is not ok 8240 1726773115.29477: done checking to see if all hosts have failed 8240 1726773115.29478: getting the remaining hosts for this loop 8240 1726773115.29479: done getting the remaining hosts for this loop 8240 1726773115.29483: getting the next task for host managed_node2 8240 1726773115.29492: done getting next task for host managed_node2 8240 1726773115.29494: ^ task is: TASK: Restart tuned 8240 1726773115.29496: ^ state is: HOST STATE: block=2, task=51, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 8240 1726773115.29500: getting variables 8240 1726773115.29502: in VariableManager get_vars() 8240 1726773115.29536: Calling all_inventory to load vars for managed_node2 8240 1726773115.29539: Calling groups_inventory to load vars for managed_node2 8240 1726773115.29541: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773115.29551: Calling all_plugins_play to load vars for managed_node2 8240 1726773115.29553: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773115.29554: Calling groups_plugins_play to load vars for managed_node2 8240 1726773115.29709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773115.29822: done with get_vars() 8240 1726773115.29830: done getting variables 8240 1726773115.29872: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restart tuned] *********************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64 Thursday 19 September 2024 15:11:55 -0400 (0:00:00.636) 0:01:33.942 **** 8240 1726773115.29896: entering _queue_task() for managed_node2/service 8240 1726773115.30064: worker is 1 (out of 1 available) 8240 1726773115.30079: exiting _queue_task() for managed_node2/service 8240 1726773115.30097: done queuing things up, now waiting for results queue to drain 8240 1726773115.30099: waiting for pending results... 
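The final cleanup step queued above, "Restart tuned", goes through the service action plugin, which dispatches to ansible.legacy.systemd on the target; the module arguments logged further down are name=tuned, state=started, enabled=true, and the loop item "tuned" comes from __kernel_settings_services loaded via include_vars. A minimal sketch of an equivalent task (despite the task name, only state "started" appears in this run, so that is what the sketch uses):

  - name: Restart tuned
    service:
      name: "{{ item }}"
      state: started
      enabled: true
    loop: "{{ __kernel_settings_services }}"
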
11601 1726773115.30231: running TaskExecutor() for managed_node2/TASK: Restart tuned 11601 1726773115.30346: in run() - task 0affffe7-6841-885f-bbcf-000000000cb3 11601 1726773115.30362: variable 'ansible_search_path' from source: unknown 11601 1726773115.30367: variable 'ansible_search_path' from source: unknown 11601 1726773115.30406: variable '__kernel_settings_services' from source: include_vars 11601 1726773115.30658: variable '__kernel_settings_services' from source: include_vars 11601 1726773115.30722: variable 'omit' from source: magic vars 11601 1726773115.30818: variable 'ansible_host' from source: host vars for 'managed_node2' 11601 1726773115.30830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11601 1726773115.30839: variable 'omit' from source: magic vars 11601 1726773115.30893: variable 'omit' from source: magic vars 11601 1726773115.30922: variable 'omit' from source: magic vars 11601 1726773115.30951: variable 'item' from source: unknown 11601 1726773115.31013: variable 'item' from source: unknown 11601 1726773115.31035: variable 'omit' from source: magic vars 11601 1726773115.31066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11601 1726773115.31095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11601 1726773115.31115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11601 1726773115.31130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11601 1726773115.31142: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11601 1726773115.31167: variable 'inventory_hostname' from source: host vars for 'managed_node2' 11601 1726773115.31172: variable 'ansible_host' from source: host vars for 'managed_node2' 11601 1726773115.31177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11601 1726773115.31249: Set connection var ansible_pipelining to False 11601 1726773115.31257: Set connection var ansible_timeout to 10 11601 1726773115.31264: Set connection var ansible_module_compression to ZIP_DEFLATED 11601 1726773115.31268: Set connection var ansible_shell_type to sh 11601 1726773115.31273: Set connection var ansible_shell_executable to /bin/sh 11601 1726773115.31278: Set connection var ansible_connection to ssh 11601 1726773115.31294: variable 'ansible_shell_executable' from source: unknown 11601 1726773115.31298: variable 'ansible_connection' from source: unknown 11601 1726773115.31301: variable 'ansible_module_compression' from source: unknown 11601 1726773115.31305: variable 'ansible_shell_type' from source: unknown 11601 1726773115.31307: variable 'ansible_shell_executable' from source: unknown 11601 1726773115.31308: variable 'ansible_host' from source: host vars for 'managed_node2' 11601 1726773115.31311: variable 'ansible_pipelining' from source: unknown 11601 1726773115.31312: variable 'ansible_timeout' from source: unknown 11601 1726773115.31314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 11601 1726773115.31407: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11601 1726773115.31417: variable 'omit' from source: magic vars 11601 1726773115.31422: starting attempt loop 11601 1726773115.31424: running the handler 11601 1726773115.31484: variable 'ansible_facts' from source: unknown 11601 1726773115.31575: _low_level_execute_command(): starting 11601 1726773115.31582: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11601 1726773115.33916: stdout chunk (state=2): >>>/root <<< 11601 1726773115.34043: stderr chunk (state=3): >>><<< 11601 1726773115.34051: stdout chunk (state=3): >>><<< 11601 1726773115.34071: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 11601 1726773115.34087: _low_level_execute_command(): starting 11601 1726773115.34094: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527 `" && echo ansible-tmp-1726773115.340799-11601-196210388477527="` echo /root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527 `" ) && sleep 0' 11601 1726773115.36652: stdout chunk (state=2): >>>ansible-tmp-1726773115.340799-11601-196210388477527=/root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527 <<< 11601 1726773115.36786: stderr chunk (state=3): >>><<< 11601 1726773115.36794: stdout chunk (state=3): >>><<< 11601 1726773115.36812: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773115.340799-11601-196210388477527=/root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527 , stderr= 11601 1726773115.36839: variable 'ansible_module_compression' from source: unknown 11601 1726773115.36886: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8240kvoq26km/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11601 1726773115.36941: variable 'ansible_facts' from source: unknown 11601 1726773115.37104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527/AnsiballZ_systemd.py 11601 1726773115.37216: Sending initial data 11601 1726773115.37223: Sent initial data (154 bytes) 11601 1726773115.39752: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8240kvoq26km/tmp208dnbtw /root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527/AnsiballZ_systemd.py <<< 11601 1726773115.41721: stderr chunk (state=3): >>><<< 11601 1726773115.41732: stdout chunk (state=3): >>><<< 11601 1726773115.41754: done transferring module to remote 11601 1726773115.41765: _low_level_execute_command(): starting 11601 1726773115.41771: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527/ /root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527/AnsiballZ_systemd.py && sleep 0' 11601 1726773115.44173: stderr chunk (state=2): >>><<< 11601 1726773115.44184: stdout chunk (state=2): >>><<< 11601 1726773115.44201: _low_level_execute_command() done: rc=0, stdout=, stderr= 11601 1726773115.44207: _low_level_execute_command(): starting 11601 1726773115.44212: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527/AnsiballZ_systemd.py && sleep 0' 11601 1726773115.71852: stdout chunk (state=2): >>> <<< 11601 
1726773115.71897: stdout chunk (state=3): >>>{"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "21086208", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_<<< 11601 1726773115.71924: stdout chunk (state=3): >>>sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", 
"AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11601 1726773115.73524: stderr chunk (state=3): >>>Shared connection to 10.31.9.64 closed. <<< 11601 1726773115.73571: stderr chunk (state=3): >>><<< 11601 1726773115.73578: stdout chunk (state=3): >>><<< 11601 1726773115.73600: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "671", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "21086208", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", 
"LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "Before": "shutdown.target multi-user.target", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", 
"InactiveExitTimestampMonotonic": "6488468", "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.9.64 closed. 11601 1726773115.73740: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 11601 1726773115.73760: _low_level_execute_command(): starting 11601 1726773115.73766: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773115.340799-11601-196210388477527/ > /dev/null 2>&1 && sleep 0' 11601 1726773115.76170: stderr chunk (state=2): >>><<< 11601 1726773115.76181: stdout chunk (state=2): >>><<< 11601 1726773115.76199: _low_level_execute_command() done: rc=0, stdout=, stderr= 11601 1726773115.76208: handler run complete 11601 1726773115.76242: attempt loop complete, returning result 11601 1726773115.76258: variable 'item' from source: unknown 11601 1726773115.76323: variable 'item' from source: unknown ok: [managed_node2] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:11:03 EDT", "ActiveEnterTimestampMonotonic": "7348255", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "dbus.service dbus.socket network.target sysinit.target system.slice systemd-journald.socket systemd-sysctl.service polkit.service basic.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:11:02 EDT", "AssertTimestampMonotonic": "6485977", "Before": "shutdown.target 
multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ConditionTimestampMonotonic": "6485975", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service power-profiles-daemon.service tlp.service auto-cpufreq.service shutdown.target", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "671", "ExecMainStartTimestamp": "Thu 2024-09-19 15:11:02 EDT", "ExecMainStartTimestampMonotonic": "6488426", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:11:02 EDT] ; stop_time=[n/a] ; pid=671 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:11:02 EDT", "InactiveExitTimestampMonotonic": "6488468", "InvocationID": "eb4e11a07baf44e8a558597e80e102a8", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "671", "MemoryAccounting": "yes", "MemoryCurrent": "21086208", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.socket dbus.service sysinit.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:11:03 EDT", "StateChangeTimestampMonotonic": "7348255", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:11:03 EDT", "WatchdogTimestampMonotonic": "7348252", "WatchdogUSec": "0" } } 11601 1726773115.76434: dumping result to json 11601 1726773115.76452: done dumping result, returning 11601 1726773115.76459: done running TaskExecutor() for managed_node2/TASK: Restart tuned [0affffe7-6841-885f-bbcf-000000000cb3] 11601 1726773115.76465: sending task result for task 0affffe7-6841-885f-bbcf-000000000cb3 11601 1726773115.76570: done 
sending task result for task 0affffe7-6841-885f-bbcf-000000000cb3 11601 1726773115.76575: WORKER PROCESS EXITING 8240 1726773115.76925: no more pending results, returning what we have 8240 1726773115.76928: results queue empty 8240 1726773115.76929: checking for any_errors_fatal 8240 1726773115.76934: done checking for any_errors_fatal 8240 1726773115.76934: checking for max_fail_percentage 8240 1726773115.76935: done checking for max_fail_percentage 8240 1726773115.76936: checking to see if all hosts have failed and the running result is not ok 8240 1726773115.76936: done checking to see if all hosts have failed 8240 1726773115.76937: getting the remaining hosts for this loop 8240 1726773115.76938: done getting the remaining hosts for this loop 8240 1726773115.76940: getting the next task for host managed_node2 8240 1726773115.76946: done getting next task for host managed_node2 8240 1726773115.76947: ^ task is: TASK: meta (flush_handlers) 8240 1726773115.76948: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773115.76952: getting variables 8240 1726773115.76953: in VariableManager get_vars() 8240 1726773115.76976: Calling all_inventory to load vars for managed_node2 8240 1726773115.76978: Calling groups_inventory to load vars for managed_node2 8240 1726773115.76979: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773115.76989: Calling all_plugins_play to load vars for managed_node2 8240 1726773115.76992: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773115.76993: Calling groups_plugins_play to load vars for managed_node2 8240 1726773115.77100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773115.77211: done with get_vars() 8240 1726773115.77219: done getting variables 8240 1726773115.77267: in VariableManager get_vars() 8240 1726773115.77276: Calling all_inventory to load vars for managed_node2 8240 1726773115.77277: Calling groups_inventory to load vars for managed_node2 8240 1726773115.77278: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773115.77281: Calling all_plugins_play to load vars for managed_node2 8240 1726773115.77282: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773115.77284: Calling groups_plugins_play to load vars for managed_node2 8240 1726773115.77569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773115.77669: done with get_vars() 8240 1726773115.77678: done queuing things up, now waiting for results queue to drain 8240 1726773115.77679: results queue empty 8240 1726773115.77679: checking for any_errors_fatal 8240 1726773115.77684: done checking for any_errors_fatal 8240 1726773115.77686: checking for max_fail_percentage 8240 1726773115.77687: done checking for max_fail_percentage 8240 1726773115.77687: checking to see if all hosts have failed and the running result is not ok 8240 1726773115.77687: done checking to see if all hosts have failed 8240 1726773115.77688: getting the remaining hosts for this loop 8240 1726773115.77688: done getting the remaining hosts for this loop 8240 1726773115.77690: getting the next task for host managed_node2 8240 1726773115.77693: done getting 
next task for host managed_node2 8240 1726773115.77694: ^ task is: TASK: meta (flush_handlers) 8240 1726773115.77695: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8240 1726773115.77698: getting variables 8240 1726773115.77698: in VariableManager get_vars() 8240 1726773115.77706: Calling all_inventory to load vars for managed_node2 8240 1726773115.77707: Calling groups_inventory to load vars for managed_node2 8240 1726773115.77709: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773115.77712: Calling all_plugins_play to load vars for managed_node2 8240 1726773115.77713: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773115.77714: Calling groups_plugins_play to load vars for managed_node2 8240 1726773115.77787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773115.77890: done with get_vars() 8240 1726773115.77895: done getting variables 8240 1726773115.77926: in VariableManager get_vars() 8240 1726773115.77933: Calling all_inventory to load vars for managed_node2 8240 1726773115.77935: Calling groups_inventory to load vars for managed_node2 8240 1726773115.77936: Calling all_plugins_inventory to load vars for managed_node2 8240 1726773115.77938: Calling all_plugins_play to load vars for managed_node2 8240 1726773115.77939: Calling groups_plugins_inventory to load vars for managed_node2 8240 1726773115.77941: Calling groups_plugins_play to load vars for managed_node2 8240 1726773115.78015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8240 1726773115.78113: done with get_vars() 8240 1726773115.78120: done queuing things up, now waiting for results queue to drain 8240 1726773115.78122: results queue empty 8240 1726773115.78122: checking for any_errors_fatal 8240 1726773115.78124: done checking for any_errors_fatal 8240 1726773115.78124: checking for max_fail_percentage 8240 1726773115.78124: done checking for max_fail_percentage 8240 1726773115.78125: checking to see if all hosts have failed and the running result is not ok 8240 1726773115.78125: done checking to see if all hosts have failed 8240 1726773115.78125: getting the remaining hosts for this loop 8240 1726773115.78126: done getting the remaining hosts for this loop 8240 1726773115.78127: getting the next task for host managed_node2 8240 1726773115.78129: done getting next task for host managed_node2 8240 1726773115.78130: ^ task is: None 8240 1726773115.78131: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8240 1726773115.78131: done queuing things up, now waiting for results queue to drain 8240 1726773115.78132: results queue empty 8240 1726773115.78132: checking for any_errors_fatal 8240 1726773115.78132: done checking for any_errors_fatal 8240 1726773115.78133: checking for max_fail_percentage 8240 1726773115.78133: done checking for max_fail_percentage 8240 1726773115.78134: checking to see if all hosts have failed and the running result is not ok 8240 1726773115.78134: done checking to see if all hosts have failed 8240 1726773115.78135: getting the next task for host managed_node2 8240 1726773115.78137: done getting next task for host managed_node2 8240 1726773115.78137: ^ task is: None 8240 1726773115.78138: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2 : ok=135 changed=19 unreachable=0 failed=0 skipped=58 rescued=0 ignored=0

Thursday 19 September 2024 15:11:55 -0400 (0:00:00.483) 0:01:34.425 ****
===============================================================================
Reboot the machine - see if settings persist after reboot -------------- 23.40s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:95
Ensure required packages are installed ---------------------------------- 5.66s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:22
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 5.06s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.88s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.86s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.84s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 2.80s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
Gathering Facts --------------------------------------------------------- 2.15s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:2
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.55s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.54s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.51s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
fedora.linux_system_roles.kernel_settings : Tuned apply settings -------- 1.49s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157
Ensure required services are enabled and started ------------------------ 1.15s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:51
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.98s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
Generate a configuration for kernel settings ---------------------------- 0.81s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_change_settings.yml:45
fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes --- 0.77s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.74s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.71s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.71s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.69s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
8240 1726773115.78236: RUNNING CLEANUP
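
Note: the "Restart tuned" result logged above comes from the ansible.legacy.systemd module invoked with the module_args shown in the log (name=tuned, state=started, enabled=true, looped over item=tuned). The sketch below is a hypothetical reconstruction of an equivalent task for illustration only; it is not the actual task from the fedora.linux_system_roles test playbook, and the loop list is assumed from the single logged item.

    # Hypothetical task sketch matching the module_args seen in the log above.
    - name: Restart tuned
      ansible.builtin.systemd:
        name: "{{ item }}"
        state: started
        enabled: true
      loop:
        - tuned

A task like this would report "changed": false here, as in the logged result, because tuned was already enabled and running on managed_node2.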