[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
8186 1726776611.95630: starting run
ansible-playbook [core 2.16.11]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-uMf
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
8186 1726776611.95915: Added group all to inventory
8186 1726776611.95916: Added group ungrouped to inventory
8186 1726776611.95919: Group all now contains ungrouped
8186 1726776611.95921: Examining possible inventory source: /tmp/kernel_settings-iny/inventory.yml
8186 1726776612.04367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
8186 1726776612.04409: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
8186 1726776612.04426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
8186 1726776612.04468: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
8186 1726776612.04515: Loaded config def from plugin (inventory/script)
8186 1726776612.04517: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
8186 1726776612.04546: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
8186 1726776612.04604: Loaded config def from plugin (inventory/yaml)
8186 1726776612.04606: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
8186 1726776612.04668: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
8186 1726776612.04943: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
8186 1726776612.04945: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
8186 1726776612.04948: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
8186 1726776612.04952: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
8186 1726776612.04957: Loading data from /tmp/kernel_settings-iny/inventory.yml
8186 1726776612.05000: /tmp/kernel_settings-iny/inventory.yml was not parsable by auto
8186 1726776612.05042: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
8186 1726776612.05073: Loading data from /tmp/kernel_settings-iny/inventory.yml
8186 1726776612.05123: group all already in inventory
8186 1726776612.05130: set inventory_file for managed_node1
8186 1726776612.05134: set inventory_dir for managed_node1
8186 1726776612.05134: Added host managed_node1 to inventory
8186 1726776612.05136: Added host managed_node1 to group all
8186 1726776612.05136: set ansible_host for managed_node1
8186 1726776612.05137: set ansible_ssh_extra_args for managed_node1
8186 1726776612.05139: set inventory_file for managed_node2
8186 1726776612.05140: set inventory_dir for managed_node2
8186 1726776612.05141: Added host managed_node2 to inventory
8186 1726776612.05142: Added host managed_node2 to group all
8186 1726776612.05142: set ansible_host for managed_node2
8186 1726776612.05142: set ansible_ssh_extra_args for managed_node2
8186 1726776612.05144: set inventory_file for managed_node3
8186 1726776612.05145: set inventory_dir for managed_node3
8186 1726776612.05145: Added host managed_node3 to inventory
8186 1726776612.05146: Added host managed_node3 to group all
8186 1726776612.05147: set ansible_host for managed_node3
8186 1726776612.05147: set ansible_ssh_extra_args for managed_node3
8186 1726776612.05149: Reconcile groups and hosts in inventory.
8186 1726776612.05151: Group ungrouped now contains managed_node1
8186 1726776612.05152: Group ungrouped now contains managed_node2
8186 1726776612.05153: Group ungrouped now contains managed_node3
8186 1726776612.05207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
8186 1726776612.05291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
8186 1726776612.05322: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
8186 1726776612.05341: Loaded config def from plugin (vars/host_group_vars)
8186 1726776612.05342: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
8186 1726776612.05347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
8186 1726776612.05352: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
8186 1726776612.05380: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
8186 1726776612.05624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776612.05691: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
8186 1726776612.05715: Loaded config def from plugin (connection/local)
8186 1726776612.05717: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
8186 1726776612.06039: Loaded config def from plugin (connection/paramiko_ssh)
8186 1726776612.06041: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
8186 1726776612.06615: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8186 1726776612.06638: Loaded config def from plugin (connection/psrp)
8186 1726776612.06640: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
8186 1726776612.07057: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8186 1726776612.07080: Loaded config def from plugin (connection/ssh)
8186 1726776612.07081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
8186 1726776612.08219: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
8186 1726776612.08243: Loaded config def from plugin (connection/winrm)
8186 1726776612.08245: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
8186 1726776612.08268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
8186 1726776612.08310: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
8186 1726776612.08350: Loaded config def from plugin (shell/cmd)
8186 1726776612.08351: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
8186 1726776612.08370: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
8186 1726776612.08407: Loaded config def from plugin (shell/powershell)
8186 1726776612.08409: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
8186 1726776612.08446: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
8186 1726776612.08548: Loaded config def from plugin (shell/sh)
8186 1726776612.08550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
8186 1726776612.08574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
8186 1726776612.08645: Loaded config def from plugin (become/runas)
8186 1726776612.08646: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
8186 1726776612.08754: Loaded config def from plugin (become/su)
8186 1726776612.08757: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
8186 1726776612.08850: Loaded config def from plugin (become/sudo)
8186 1726776612.08851: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
8186 1726776612.08878: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml
8186 1726776612.09200: trying /usr/local/lib/python3.12/site-packages/ansible/modules
8186 1726776612.11154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
8186 1726776612.11290: in VariableManager get_vars()
8186 1726776612.11304: done with get_vars()
8186 1726776612.11336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
8186 1726776612.11345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
8186 1726776612.11541: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
8186 1726776612.11637: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
8186 1726776612.11639: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-uMf/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
8186 1726776612.11661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
8186 1726776612.11677: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
8186 1726776612.11772: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
8186 1726776612.11807: Loaded config def from plugin (callback/default)
8186 1726776612.11808: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
8186 1726776612.12524: Loaded config def from plugin (callback/junit)
8186 1726776612.12526: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
8186 1726776612.12559: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
8186 1726776612.12596: Loaded config def from plugin (callback/minimal)
8186 1726776612.12597: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
8186 1726776612.12623: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
8186 1726776612.12663: Loaded config def from plugin (callback/tree)
8186 1726776612.12664: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
8186 1726776612.12866: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
8186 1726776612.12868: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-uMf/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
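The deprecation warning at the top of this run names its own two remedies: switch to the singular `ANSIBLE_COLLECTIONS_PATH` variable, or set `deprecation_warnings=False` in `ansible.cfg`. A minimal sketch of both, reusing the collections path from this run; the heredoc is only for illustration (normally you would edit `ansible.cfg` in place):

```shell
# Use the singular env var the warning asks for (path taken from this run).
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-uMf

# Or silence deprecation warnings via ansible.cfg ([defaults] section).
cat > ansible.cfg <<'EOF'
[defaults]
collections_path = /tmp/collections-uMf
deprecation_warnings = False
EOF
```

Note that disabling `deprecation_warnings` hides all deprecation notices, not just this one, so renaming the variable is the more targeted fix.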
PLAYBOOK: tests_bool_not_allowed.yml *******************************************
1 plays in /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml
8186 1726776612.12884: in VariableManager get_vars()
8186 1726776612.12894: done with get_vars()
8186 1726776612.12898: in VariableManager get_vars()
8186 1726776612.12903: done with get_vars()
8186 1726776612.12906: variable 'omit' from source: magic vars
8186 1726776612.12931: in VariableManager get_vars()
8186 1726776612.12940: done with get_vars()
8186 1726776612.12952: variable 'omit' from source: magic vars

PLAY [Test boolean values not allowed] *****************************************
8186 1726776612.13304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
8186 1726776612.13355: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
8186 1726776612.13378: getting the remaining hosts for this loop
8186 1726776612.13379: done getting the remaining hosts for this loop
8186 1726776612.13381: getting the next task for host managed_node1
8186 1726776612.13383: done getting next task for host managed_node1
8186 1726776612.13384: ^ task is: TASK: Gathering Facts
8186 1726776612.13385: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8186 1726776612.13389: getting variables
8186 1726776612.13390: in VariableManager get_vars()
8186 1726776612.13396: Calling all_inventory to load vars for managed_node1
8186 1726776612.13398: Calling groups_inventory to load vars for managed_node1
8186 1726776612.13399: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776612.13408: Calling all_plugins_play to load vars for managed_node1
8186 1726776612.13414: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776612.13416: Calling groups_plugins_play to load vars for managed_node1
8186 1726776612.13440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776612.13470: done with get_vars()
8186 1726776612.13475: done getting variables
8186 1726776612.13519: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:2
Thursday 19 September 2024 16:10:12 -0400 (0:00:00.007) 0:00:00.007 ****
8186 1726776612.13535: entering _queue_task() for managed_node1/gather_facts
8186 1726776612.13536: Creating lock for gather_facts
8186 1726776612.13747: worker is 1 (out of 1 available)
8186 1726776612.13756: exiting _queue_task() for managed_node1/gather_facts
8186 1726776612.13768: done queuing things up, now waiting for results queue to drain
8186 1726776612.13771: waiting for pending results...
8189 1726776612.13845: running TaskExecutor() for managed_node1/TASK: Gathering Facts
8189 1726776612.13937: in run() - task 120fa90a-8a95-f1be-6eb1-00000000000d
8189 1726776612.13952: variable 'ansible_search_path' from source: unknown
8189 1726776612.13982: calling self._execute()
8189 1726776612.14021: variable 'ansible_host' from source: host vars for 'managed_node1'
8189 1726776612.14032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8189 1726776612.14040: variable 'omit' from source: magic vars
8189 1726776612.14106: variable 'omit' from source: magic vars
8189 1726776612.14126: variable 'omit' from source: magic vars
8189 1726776612.14150: variable 'omit' from source: magic vars
8189 1726776612.14184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8189 1726776612.14209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8189 1726776612.14228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8189 1726776612.14244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8189 1726776612.14254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8189 1726776612.14278: variable 'inventory_hostname' from source: host vars for 'managed_node1'
8189 1726776612.14283: variable 'ansible_host' from source: host vars for 'managed_node1'
8189 1726776612.14288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8189 1726776612.14357: Set connection var ansible_shell_executable to /bin/sh
8189 1726776612.14365: Set connection var ansible_timeout to 10
8189 1726776612.14371: Set connection var ansible_module_compression to ZIP_DEFLATED
8189 1726776612.14375: Set connection var ansible_connection to ssh
8189 1726776612.14381: Set connection var ansible_pipelining to False
8189 1726776612.14386: Set connection var ansible_shell_type to sh
8189 1726776612.14400: variable 'ansible_shell_executable' from source: unknown
8189 1726776612.14404: variable 'ansible_connection' from source: unknown
8189 1726776612.14407: variable 'ansible_module_compression' from source: unknown
8189 1726776612.14411: variable 'ansible_shell_type' from source: unknown
8189 1726776612.14414: variable 'ansible_shell_executable' from source: unknown
8189 1726776612.14417: variable 'ansible_host' from source: host vars for 'managed_node1'
8189 1726776612.14422: variable 'ansible_pipelining' from source: unknown
8189 1726776612.14425: variable 'ansible_timeout' from source: unknown
8189 1726776612.14431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8189 1726776612.14544: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False)
8189 1726776612.14558: variable 'omit' from source: magic vars
8189 1726776612.14563: starting attempt loop
8189 1726776612.14567: running the handler
8189 1726776612.14578: variable 'ansible_facts' from source: unknown
8189 1726776612.14593: _low_level_execute_command(): starting
8189 1726776612.14601: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8189 1726776612.17115: stderr chunk (state=2): >>>Warning: Permanently added '10.31.14.221' (ECDSA) to the list of known hosts. <<<
8189 1726776612.38485: stdout chunk (state=3): >>>/root <<<
8189 1726776612.38706: stderr chunk (state=3): >>><<<
8189 1726776612.38714: stdout chunk (state=3): >>><<<
8189 1726776612.38737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=Warning: Permanently added '10.31.14.221' (ECDSA) to the list of known hosts.
8189 1726776612.38751: _low_level_execute_command(): starting
8189 1726776612.38758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312 `" && echo ansible-tmp-1726776612.3874557-8189-179651865801312="` echo /root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312 `" ) && sleep 0'
8189 1726776612.41277: stdout chunk (state=2): >>>ansible-tmp-1726776612.3874557-8189-179651865801312=/root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312 <<<
8189 1726776612.41405: stderr chunk (state=3): >>><<<
8189 1726776612.41411: stdout chunk (state=3): >>><<<
8189 1726776612.41426: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776612.3874557-8189-179651865801312=/root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312 , stderr=
8189 1726776612.41450: variable 'ansible_module_compression' from source: unknown
8189 1726776612.41497: ANSIBALLZ: Using generic lock for ansible.legacy.setup
8189 1726776612.41502: ANSIBALLZ: Acquiring lock
8189 1726776612.41505: ANSIBALLZ: Lock acquired: 140184657595568
8189 1726776612.41509: ANSIBALLZ: Creating module
8189 1726776612.62570: ANSIBALLZ: Writing module into payload
8189 1726776612.62682: ANSIBALLZ: Writing module
8189 1726776612.62703: ANSIBALLZ: Renaming module
8189 1726776612.62710: ANSIBALLZ: Done creating module
8189 1726776612.62737: variable 'ansible_facts' from source: unknown
8189 1726776612.62743: variable 'inventory_hostname' from source: host vars for 'managed_node1'
8189 1726776612.62752: _low_level_execute_command(): starting
8189 1726776612.62758: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0'
8189 1726776612.65014: stdout chunk (state=2): >>>PLATFORM <<<
8189 1726776612.65080: stdout chunk (state=3): >>>Linux <<<
8189 1726776612.65092: stdout chunk (state=3): >>>FOUND <<<
8189 1726776612.65098: stdout chunk (state=3): >>>/usr/bin/python3.12 <<<
8189 1726776612.65117: stdout chunk (state=3): >>>/usr/bin/python3.6 <<<
8189 1726776612.65130: stdout chunk (state=3): >>>/usr/bin/python3 <<<
8189 1726776612.65142: stdout chunk (state=3): >>>/usr/libexec/platform-python ENDFOUND <<<
8189 1726776612.65281: stderr chunk (state=3): >>><<<
8189 1726776612.65287: stdout chunk (state=3): >>><<<
8189 1726776612.65300: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3.6 /usr/bin/python3 /usr/libexec/platform-python ENDFOUND , stderr=
8189 1726776612.65306 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3.6', '/usr/bin/python3', '/usr/libexec/platform-python']
8189 1726776612.65342: _low_level_execute_command(): starting
8189 1726776612.65349: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
8189 1726776612.65419: Sending initial data
8189 1726776612.65427: Sent initial data (1234 bytes)
8189 1726776612.69283: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<<
8189 1726776612.69723: stderr chunk (state=3): >>><<<
8189 1726776612.69732: stdout chunk (state=3): >>><<<
8189 1726776612.69745: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=
8189 1726776612.69791: variable 'ansible_facts' from source: unknown
8189 1726776612.69796: variable 'ansible_facts' from source: unknown
8189 1726776612.69805: variable 'ansible_module_compression' from source: unknown
8189 1726776612.69837: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
8189 1726776612.69863: variable 'ansible_facts' from source: unknown
8189 1726776612.70004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312/AnsiballZ_setup.py
8189 1726776612.70161: Sending initial data
8189 1726776612.70167: Sent initial data (152 bytes)
8189 1726776612.73069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpsnmoid3q /root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312/AnsiballZ_setup.py <<<
8189 1726776612.74898: stderr chunk (state=3): >>><<<
8189 1726776612.74908: stdout chunk (state=3): >>><<<
8189 1726776612.74931: done transferring module to remote
8189 1726776612.74942: _low_level_execute_command(): starting
8189 1726776612.74947: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312/ /root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312/AnsiballZ_setup.py && sleep 0'
8189 1726776612.77300: stderr chunk (state=2): >>><<<
8189 1726776612.77308: stdout chunk (state=2): >>><<<
8189 1726776612.77322: _low_level_execute_command() done: rc=0, stdout=, stderr=
8189 1726776612.77326: _low_level_execute_command(): starting
8189 1726776612.77333: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312/AnsiballZ_setup.py && sleep 0'
8189 1726776614.26050: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-14-221.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-221", "ansible_nodename": "ip-10-31-14-221.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e3c9776c1e69462f8fd03478850b09f2", "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKFVjQtW43lImBCoPGNVoENwTQqba6zckr0w/U6kq1LN4jE7JokkTicgnVgJS+ZbvIjOaQWlYnUrzb/aa8+L/VvPHUz4NFsOpvwlgIgaWZXIV0VVcISVu/FbSAVhXEb9d/YgdL9pzx21bhunmv9QbCmHB0PZ90zeuCC1ns7eLJnzAAAAFQCUoJrlxPxYm6RmyGs6hc8lPwljxQAAAIBZ+IVkORMP793XhATpQMNwZ5W/HStZj8PBaMERscturPSzKtqTWSQmJRfCrHXdRzykG72Smx8rcvmLD6qi<<<
8189 1726776614.26077: stdout chunk (state=3): >>>33ckZPWfnx3LNBQy5fjXQyvykkgSW4cLoycdXmzyS7H0qn8QR+KKKFL6mP7NDT0hicrwXVmkVDQsfPcQTbFcUOzyIQAAAIA4AlORbqA4kM3flwr+QdnkpFdEMgctEhCBl+v7pID7KLxEFO6Equ/T/0ACrXLULscUe6mXnQfDDyGJllizu4KO85xZSmPrKm2z6jD+Ty9CcnXcpGiKXmDE44aQQk1XIP9PT51wKuOZpP630eBZzhRmU6qLaf/tPI+UAZPqElTHOw==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC0YDHvFYKU0+pQ5IN5p9mUmdtS6cBTrB+ZKy0l+bjoQr+r6dFQ4q9w4cNw2qHqNVCfFdP4/vjJj9vZGg/TxfqAZCVv0/3RxEac83b+pecwnoOP+7+YSXUbHOPUf8RHKxwwCX+Bdw0CRSlZf7toMumnl8O9Ybc8O8mk/Zn/9trrMHU8xeqTQEkbQ2nPWsIK9ZRMVC/DLrYI7l5Ukh4k5AiT/6zWfrMmGhmbS5ShkCVvW/9ONR86pW6zjXBwnW8Pp8Z/n0WTX/uaCYmaeOrLWNpHmk0kFnonS22YrCjxeXT6MPG8cRBNVpVYNTkQ8EXSLM6c3Wpbei5+TgjxJgphMupYZcsDpn1TxYV8s8tJ1LQJyusuSGUqsXMqBPojCAWGCA8ighXUQu1H8iosDKRLuj17T8gzjHIreMeUrYrFSaAzTyuVeW0JhJv7RgY4125G/avxHRuqrLcVQg5dasqCsaOqukceOl+qfcOg2XTYp0QcfLwUAhCeoCWi9hvvvPN5oG8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCf3Oywqk5azQ+c4TBuAx4smbzyBAh4Jm6QjjFrxlx9akYq+rKyq+QL7Rutx8e/D/6YVUySk7xc4UFfugnqr3yA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public":
"AAAAC3NzaC1lZDI1NTE5AAAAICNZHqIVpra9FzpEhA1aDGDQvP13owEjUJoCoCNWZGKw", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2715, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 824, "free": 2715}, "nocache": {"free": 3302, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e349d-5e19-da61-d6e5-27661f684410", "ansible_product_uuid": "ec2e349d-5e19-da61-d6e5-27661f684410", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", 
"uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 197, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263481683968, "block_size": 4096, "block_total": 65533179, "block_available": 64326583, "block_used": 1206596, "inode_total": 131071472, "inode_available": 130994307, "inode_used": 77165, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_cmdli<<< 8189 1726776614.26130: stdout chunk (state=3): >>>ne": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.10.210 43894 10.31.14.221 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "5", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.10.210 43894 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": 
"/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible<<< 8189 1726776614.26147: stdout chunk (state=3): >>>_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:e6:b4:51", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.221", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fee6:b451", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.221", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:e6:b4:51", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.221"], 
"ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fee6:b451"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.221", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fee6:b451"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "16", "minute": "10", "second": "14", "epoch": "1726776614", "epoch_int": "1726776614", "date": "2024-09-19", "time": "16:10:14", "iso8601_micro": "2024-09-19T20:10:14.256205Z", "iso8601": "2024-09-19T20:10:14Z", "iso8601_basic": "20240919T161014256205", "iso8601_basic_short": "20240919T161014", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.67, "5m": 0.42, "15m": 0.18}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8189 1726776614.27787: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. 
<<< 8189 1726776614.27832: stderr chunk (state=3): >>><<< 8189 1726776614.27838: stdout chunk (state=3): >>><<< 8189 1726776614.27863: _low_level_execute_command() done: rc=0, stdout=[ansible_facts JSON elided; identical to the stdout chunks above], stderr=Shared connection to 10.31.14.221 closed. 8189 1726776614.28720: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8189 1726776614.28740: _low_level_execute_command(): starting 8189 1726776614.28746: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776612.3874557-8189-179651865801312/ > /dev/null 2>&1 && sleep 0' 8189 1726776614.31174: stderr chunk (state=2): >>><<< 8189 1726776614.31184: stdout chunk (state=2): >>><<< 8189 1726776614.31200: _low_level_execute_command() done: rc=0, stdout=, stderr= 8189 1726776614.31208: handler run complete 8189 1726776614.31280: variable 'ansible_facts' from source: unknown 8189 1726776614.31346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8189 1726776614.31513: variable 'ansible_facts' from source: unknown 8189 1726776614.31580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 
8189 1726776614.31653: attempt loop complete, returning result 8189 1726776614.31659: _execute() done 8189 1726776614.31663: dumping result to json 8189 1726776614.31682: done dumping result, returning 8189 1726776614.31689: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [120fa90a-8a95-f1be-6eb1-00000000000d] 8189 1726776614.31693: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000000d 8189 1726776614.31804: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000000d 8189 1726776614.31809: WORKER PROCESS EXITING ok: [managed_node1] 8186 1726776614.32243: no more pending results, returning what we have 8186 1726776614.32245: results queue empty 8186 1726776614.32245: checking for any_errors_fatal 8186 1726776614.32246: done checking for any_errors_fatal 8186 1726776614.32246: checking for max_fail_percentage 8186 1726776614.32247: done checking for max_fail_percentage 8186 1726776614.32248: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.32248: done checking to see if all hosts have failed 8186 1726776614.32248: getting the remaining hosts for this loop 8186 1726776614.32249: done getting the remaining hosts for this loop 8186 1726776614.32251: getting the next task for host managed_node1 8186 1726776614.32257: done getting next task for host managed_node1 8186 1726776614.32258: ^ task is: TASK: meta (flush_handlers) 8186 1726776614.32259: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8186 1726776614.32262: getting variables 8186 1726776614.32262: in VariableManager get_vars() 8186 1726776614.32278: Calling all_inventory to load vars for managed_node1 8186 1726776614.32280: Calling groups_inventory to load vars for managed_node1 8186 1726776614.32282: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.32289: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.32290: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.32292: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.33885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.33990: done with get_vars() 8186 1726776614.33998: done getting variables 8186 1726776614.34027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 8186 1726776614.34065: in VariableManager get_vars() 8186 1726776614.34071: Calling all_inventory to load vars for managed_node1 8186 1726776614.34073: Calling groups_inventory to load vars for managed_node1 8186 1726776614.34074: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.34077: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.34078: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.34079: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.34158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.34253: done with get_vars() 8186 1726776614.34263: done queuing things up, now waiting for results queue to drain 8186 1726776614.34264: results queue empty 8186 1726776614.34264: checking for any_errors_fatal 8186 1726776614.34266: done checking for any_errors_fatal 8186 1726776614.34267: checking for max_fail_percentage 8186 1726776614.34267: done checking for max_fail_percentage 8186 
1726776614.34268: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.34268: done checking to see if all hosts have failed 8186 1726776614.34268: getting the remaining hosts for this loop 8186 1726776614.34269: done getting the remaining hosts for this loop 8186 1726776614.34270: getting the next task for host managed_node1 8186 1726776614.34273: done getting next task for host managed_node1 8186 1726776614.34274: ^ task is: TASK: Try to pass a boolean value for sysctl value 8186 1726776614.34274: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8186 1726776614.34276: getting variables 8186 1726776614.34276: in VariableManager get_vars() 8186 1726776614.34281: Calling all_inventory to load vars for managed_node1 8186 1726776614.34282: Calling groups_inventory to load vars for managed_node1 8186 1726776614.34283: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.34286: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.34287: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.34289: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.34365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.34478: done with get_vars() 8186 1726776614.34483: done getting variables TASK [Try to pass a boolean value for sysctl value] **************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:7 Thursday 19 September 2024 16:10:14 -0400 (0:00:02.209) 0:00:02.217 **** 8186 1726776614.34531: entering _queue_task() for managed_node1/include_role 
8186 1726776614.34532: Creating lock for include_role 8186 1726776614.34720: worker is 1 (out of 1 available) 8186 1726776614.34735: exiting _queue_task() for managed_node1/include_role 8186 1726776614.34745: done queuing things up, now waiting for results queue to drain 8186 1726776614.34747: waiting for pending results... 8227 1726776614.34849: running TaskExecutor() for managed_node1/TASK: Try to pass a boolean value for sysctl value 8227 1726776614.34960: in run() - task 120fa90a-8a95-f1be-6eb1-000000000006 8227 1726776614.34976: variable 'ansible_search_path' from source: unknown 8227 1726776614.35009: calling self._execute() 8227 1726776614.35078: variable 'ansible_host' from source: host vars for 'managed_node1' 8227 1726776614.35090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8227 1726776614.35100: variable 'omit' from source: magic vars 8227 1726776614.35198: _execute() done 8227 1726776614.35205: dumping result to json 8227 1726776614.35212: done dumping result, returning 8227 1726776614.35217: done running TaskExecutor() for managed_node1/TASK: Try to pass a boolean value for sysctl value [120fa90a-8a95-f1be-6eb1-000000000006] 8227 1726776614.35225: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000006 8227 1726776614.35266: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000006 8227 1726776614.35270: WORKER PROCESS EXITING 8186 1726776614.35590: no more pending results, returning what we have 8186 1726776614.35593: in VariableManager get_vars() 8186 1726776614.35618: Calling all_inventory to load vars for managed_node1 8186 1726776614.35621: Calling groups_inventory to load vars for managed_node1 8186 1726776614.35626: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.35634: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.35636: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.35638: Calling groups_plugins_play 
to load vars for managed_node1 8186 1726776614.35747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.35857: done with get_vars() 8186 1726776614.35862: variable 'ansible_search_path' from source: unknown 8186 1726776614.35933: variable 'omit' from source: magic vars 8186 1726776614.35947: variable 'omit' from source: magic vars 8186 1726776614.35958: variable 'omit' from source: magic vars 8186 1726776614.35961: we have included files to process 8186 1726776614.35961: generating all_blocks data 8186 1726776614.35962: done generating all_blocks data 8186 1726776614.35962: processing included file: fedora.linux_system_roles.kernel_settings 8186 1726776614.35977: in VariableManager get_vars() 8186 1726776614.35985: done with get_vars() 8186 1726776614.36032: in VariableManager get_vars() 8186 1726776614.36041: done with get_vars() 8186 1726776614.36068: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8186 1726776614.36174: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8186 1726776614.36213: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8186 1726776614.36292: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8186 1726776614.36746: in VariableManager get_vars() 8186 1726776614.36762: done with get_vars() 8186 1726776614.37702: in VariableManager get_vars() 8186 1726776614.37716: done with get_vars() 8186 1726776614.37864: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8186 1726776614.38849: iterating over new_blocks loaded from include file 8186 1726776614.38851: in VariableManager get_vars() 8186 1726776614.38871: done with 
get_vars() 8186 1726776614.38873: filtering new block on tags 8186 1726776614.38892: done filtering new block on tags 8186 1726776614.38895: in VariableManager get_vars() 8186 1726776614.38909: done with get_vars() 8186 1726776614.38912: filtering new block on tags 8186 1726776614.38932: done filtering new block on tags 8186 1726776614.38934: in VariableManager get_vars() 8186 1726776614.38949: done with get_vars() 8186 1726776614.38950: filtering new block on tags 8186 1726776614.38993: done filtering new block on tags 8186 1726776614.38996: in VariableManager get_vars() 8186 1726776614.39010: done with get_vars() 8186 1726776614.39012: filtering new block on tags 8186 1726776614.39027: done filtering new block on tags 8186 1726776614.39031: done iterating over new_blocks loaded from include file 8186 1726776614.39032: extending task lists for all hosts with included blocks 8186 1726776614.39123: done extending task lists 8186 1726776614.39125: done processing included files 8186 1726776614.39125: results queue empty 8186 1726776614.39126: checking for any_errors_fatal 8186 1726776614.39127: done checking for any_errors_fatal 8186 1726776614.39128: checking for max_fail_percentage 8186 1726776614.39130: done checking for max_fail_percentage 8186 1726776614.39131: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.39131: done checking to see if all hosts have failed 8186 1726776614.39132: getting the remaining hosts for this loop 8186 1726776614.39133: done getting the remaining hosts for this loop 8186 1726776614.39135: getting the next task for host managed_node1 8186 1726776614.39139: done getting next task for host managed_node1 8186 1726776614.39141: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8186 1726776614.39143: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8186 1726776614.39151: getting variables 8186 1726776614.39152: in VariableManager get_vars() 8186 1726776614.39168: Calling all_inventory to load vars for managed_node1 8186 1726776614.39171: Calling groups_inventory to load vars for managed_node1 8186 1726776614.39173: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.39177: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.39180: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.39182: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.39384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.39575: done with get_vars() 8186 1726776614.39584: done getting variables 8186 1726776614.39651: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2
Thursday 19 September 2024 16:10:14 -0400 (0:00:00.051) 0:00:02.268 ****

8186 1726776614.39680: entering _queue_task() for managed_node1/fail 8186 1726776614.39681: Creating lock for fail 8186 1726776614.39927: worker is 1 (out of 1
available) 8186 1726776614.39942: exiting _queue_task() for managed_node1/fail 8186 1726776614.39957: done queuing things up, now waiting for results queue to drain 8186 1726776614.39959: waiting for pending results... 8229 1726776614.40180: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8229 1726776614.40309: in run() - task 120fa90a-8a95-f1be-6eb1-00000000002c 8229 1726776614.40326: variable 'ansible_search_path' from source: unknown 8229 1726776614.40333: variable 'ansible_search_path' from source: unknown 8229 1726776614.40369: calling self._execute() 8229 1726776614.40437: variable 'ansible_host' from source: host vars for 'managed_node1' 8229 1726776614.40446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8229 1726776614.40457: variable 'omit' from source: magic vars 8229 1726776614.40873: variable 'kernel_settings_sysctl' from source: include params 8229 1726776614.40891: variable '__kernel_settings_state_empty' from source: role '' all vars 8229 1726776614.40900: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 8229 1726776614.41204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8229 1726776614.43588: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8229 1726776614.43663: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8229 1726776614.43698: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8229 1726776614.43734: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8229 1726776614.43764: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8229 
1726776614.43834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8229 1726776614.43867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8229 1726776614.43892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8229 1726776614.43933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8229 1726776614.43948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8229 1726776614.44001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8229 1726776614.44026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8229 1726776614.44052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8229 1726776614.44093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 8229 1726776614.44107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8229 1726776614.44149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8229 1726776614.44174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8229 1726776614.44197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8229 1726776614.44236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8229 1726776614.44249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8229 1726776614.44538: variable 'kernel_settings_sysctl' from source: include params 8229 1726776614.44615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8229 1726776614.44774: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8229 1726776614.44807: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8229 1726776614.44839: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8229 1726776614.44873: Loading TestModule 
'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8229 1726776614.45112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8229 1726776614.45138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8229 1726776614.45166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8229 1726776614.45191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8229 1726776614.45232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8229 1726776614.45253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8229 1726776614.45279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8229 1726776614.45304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8229 1726776614.45327: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | 
selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): True 8229 1726776614.45337: variable 'omit' from source: magic vars 8229 1726776614.45379: variable 'omit' from source: magic vars 8229 1726776614.45409: variable 'omit' from source: magic vars 8229 1726776614.45434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8229 1726776614.45461: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8229 1726776614.45477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8229 1726776614.45493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8229 1726776614.45503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8229 1726776614.45537: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8229 1726776614.45544: variable 'ansible_host' from source: host vars for 'managed_node1' 8229 1726776614.45548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8229 1726776614.45645: Set connection var ansible_shell_executable to /bin/sh 8229 1726776614.45657: Set connection var ansible_timeout to 10 8229 1726776614.45664: Set connection var ansible_module_compression to ZIP_DEFLATED 8229 1726776614.45668: Set connection var ansible_connection to ssh 8229 1726776614.45675: Set connection var ansible_pipelining to False 8229 1726776614.45680: Set connection var ansible_shell_type to sh 8229 1726776614.45700: variable 'ansible_shell_executable' from source: unknown 8229 1726776614.45705: variable 'ansible_connection' from source: unknown 8229 1726776614.45708: variable 'ansible_module_compression' from source: unknown 8229 1726776614.45712: variable 'ansible_shell_type' from source: unknown 8229 
1726776614.45715: variable 'ansible_shell_executable' from source: unknown 8229 1726776614.45717: variable 'ansible_host' from source: host vars for 'managed_node1' 8229 1726776614.45721: variable 'ansible_pipelining' from source: unknown 8229 1726776614.45724: variable 'ansible_timeout' from source: unknown 8229 1726776614.45727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8229 1726776614.45811: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8229 1726776614.45823: variable 'omit' from source: magic vars 8229 1726776614.45830: starting attempt loop 8229 1726776614.45833: running the handler 8229 1726776614.45841: handler run complete 8229 1726776614.45870: attempt loop complete, returning result 8229 1726776614.45875: _execute() done 8229 1726776614.45879: dumping result to json 8229 1726776614.45882: done dumping result, returning 8229 1726776614.45889: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-f1be-6eb1-00000000002c] 8229 1726776614.45894: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000002c 8229 1726776614.45919: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000002c 8229 1726776614.45922: WORKER PROCESS EXITING 8186 1726776614.46191: marking managed_node1 as failed 8186 1726776614.46199: marking host managed_node1 failed, current state: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8186 1726776614.46205: ^ failed state is now: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=5, fail_state=2, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8186 1726776614.46207: getting the next task for host managed_node1 8186 1726776614.46212: done getting next task for host managed_node1 8186 1726776614.46214: ^ task is: TASK: Check for sysctl bool value error 8186 1726776614.46215: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

fatal: [managed_node1]: FAILED! => {
    "changed": false
}

MSG:

Boolean values are not allowed for sysctl settings

8186 1726776614.46334: no more pending results, returning what we have 8186 1726776614.46336: results queue empty 8186 1726776614.46337: checking for any_errors_fatal 8186 1726776614.46341: done checking for any_errors_fatal 8186 1726776614.46341: checking for max_fail_percentage 8186 1726776614.46342: done checking for max_fail_percentage 8186 1726776614.46343: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.46343: done checking to see if all hosts have failed 8186 1726776614.46344: getting the remaining hosts for this loop 8186 1726776614.46345: done getting the remaining hosts for this loop 8186 1726776614.46347: getting the next task for host managed_node1 8186 1726776614.46350: done getting next task for host managed_node1 8186 1726776614.46352: ^ task is: TASK: Check for sysctl bool value error 8186 1726776614.46353: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 8186 1726776614.46360: getting variables 8186 1726776614.46361: in VariableManager get_vars() 8186 1726776614.46389: Calling all_inventory to load vars for managed_node1 8186 1726776614.46391: Calling groups_inventory to load vars for managed_node1 8186 1726776614.46393: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.46401: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.46403: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.46405: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.46573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.46760: done with get_vars() 8186 1726776614.46771: done getting variables 8186 1726776614.46860: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Check for sysctl bool value error] ***************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:25
Thursday 19 September 2024 16:10:14 -0400 (0:00:00.072) 0:00:02.340 ****

8186 1726776614.46887: entering _queue_task() for managed_node1/assert 8186 1726776614.46889: Creating lock for assert 8186 1726776614.47091: worker is 1 (out of 1 available) 8186 1726776614.47104: exiting _queue_task() for managed_node1/assert 8186 1726776614.47114: done queuing things up, now waiting for results queue to drain 8186 1726776614.47118: waiting for pending results...
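[Editor's note: the failure recorded above comes from the role's boolean check, whose conditional the log shows being evaluated (`selectattr("value", "defined") | selectattr("value", "sameas", true)` and the same for `false`). A minimal Python sketch of that filter logic follows; the `kernel_settings_sysctl` sample list is hypothetical, only the selection logic mirrors the expression in the log.]

```python
# Hypothetical settings list; in the real test this comes from the
# playbook's include params for the kernel_settings role.
kernel_settings_sysctl = [
    {"name": "fs.file-max", "value": 65536},
    {"name": "fs.suid_dumpable", "value": True},  # boolean -> rejected
]

# selectattr("value", "defined") keeps items that have the attribute;
# selectattr("value", "sameas", true/false) is an identity test, so
# integers 0/1 do not match -- only genuine booleans do.
defined = [s for s in kernel_settings_sysctl if "value" in s]
true_vals = [s for s in defined if s["value"] is True]
false_vals = [s for s in defined if s["value"] is False]

# The role runs its fail task when either list is non-empty.
has_boolean = len(true_vals) > 0 or len(false_vals) > 0
print(has_boolean)
```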
8230 1726776614.47300: running TaskExecutor() for managed_node1/TASK: Check for sysctl bool value error 8230 1726776614.47401: in run() - task 120fa90a-8a95-f1be-6eb1-000000000008 8230 1726776614.47419: variable 'ansible_search_path' from source: unknown 8230 1726776614.47452: calling self._execute() 8230 1726776614.47518: variable 'ansible_host' from source: host vars for 'managed_node1' 8230 1726776614.47526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8230 1726776614.47537: variable 'omit' from source: magic vars 8230 1726776614.47627: variable 'omit' from source: magic vars 8230 1726776614.47659: variable 'omit' from source: magic vars 8230 1726776614.47690: variable 'omit' from source: magic vars 8230 1726776614.47726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8230 1726776614.47761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8230 1726776614.47783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8230 1726776614.47799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8230 1726776614.47812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8230 1726776614.47842: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8230 1726776614.47848: variable 'ansible_host' from source: host vars for 'managed_node1' 8230 1726776614.47853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8230 1726776614.47950: Set connection var ansible_shell_executable to /bin/sh 8230 1726776614.47961: Set connection var ansible_timeout to 10 8230 1726776614.47968: Set connection var ansible_module_compression to ZIP_DEFLATED 8230 1726776614.47971: Set connection var ansible_connection to 
ssh 8230 1726776614.47978: Set connection var ansible_pipelining to False 8230 1726776614.47983: Set connection var ansible_shell_type to sh 8230 1726776614.48002: variable 'ansible_shell_executable' from source: unknown 8230 1726776614.48006: variable 'ansible_connection' from source: unknown 8230 1726776614.48010: variable 'ansible_module_compression' from source: unknown 8230 1726776614.48013: variable 'ansible_shell_type' from source: unknown 8230 1726776614.48016: variable 'ansible_shell_executable' from source: unknown 8230 1726776614.48019: variable 'ansible_host' from source: host vars for 'managed_node1' 8230 1726776614.48022: variable 'ansible_pipelining' from source: unknown 8230 1726776614.48025: variable 'ansible_timeout' from source: unknown 8230 1726776614.48030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8230 1726776614.48150: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8230 1726776614.48167: variable 'omit' from source: magic vars 8230 1726776614.48173: starting attempt loop 8230 1726776614.48176: running the handler 8230 1726776614.48533: variable 'ansible_failed_result' from source: set_fact 8230 1726776614.48551: Evaluated conditional (ansible_failed_result.msg != 'UNREACH'): True 8230 1726776614.48561: handler run complete 8230 1726776614.48575: attempt loop complete, returning result 8230 1726776614.48579: _execute() done 8230 1726776614.48582: dumping result to json 8230 1726776614.48585: done dumping result, returning 8230 1726776614.48591: done running TaskExecutor() for managed_node1/TASK: Check for sysctl bool value error [120fa90a-8a95-f1be-6eb1-000000000008] 8230 1726776614.48596: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000008 
8230 1726776614.48621: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000008 8230 1726776614.48624: WORKER PROCESS EXITING

ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed

8186 1726776614.48980: no more pending results, returning what we have 8186 1726776614.48983: results queue empty 8186 1726776614.48983: checking for any_errors_fatal 8186 1726776614.48989: done checking for any_errors_fatal 8186 1726776614.48990: checking for max_fail_percentage 8186 1726776614.48991: done checking for max_fail_percentage 8186 1726776614.48991: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.48992: done checking to see if all hosts have failed 8186 1726776614.48993: getting the remaining hosts for this loop 8186 1726776614.48994: done getting the remaining hosts for this loop 8186 1726776614.48997: getting the next task for host managed_node1 8186 1726776614.49004: done getting next task for host managed_node1 8186 1726776614.49007: ^ task is: TASK: Cleanup 8186 1726776614.49008: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? True, did start at task?
False 8186 1726776614.49011: getting variables 8186 1726776614.49012: in VariableManager get_vars() 8186 1726776614.49044: Calling all_inventory to load vars for managed_node1 8186 1726776614.49047: Calling groups_inventory to load vars for managed_node1 8186 1726776614.49049: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.49060: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.49063: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.49066: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.49223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.49412: done with get_vars() 8186 1726776614.49422: done getting variables

TASK [Cleanup] *****************************************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:30
Thursday 19 September 2024 16:10:14 -0400 (0:00:00.026) 0:00:02.366 ****

8186 1726776614.49505: entering _queue_task() for managed_node1/include_tasks 8186 1726776614.49507: Creating lock for include_tasks 8186 1726776614.49700: worker is 1 (out of 1 available) 8186 1726776614.49713: exiting _queue_task() for managed_node1/include_tasks 8186 1726776614.49723: done queuing things up, now waiting for results queue to drain 8186 1726776614.49725: waiting for pending results...
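[Editor's note: the "All assertions passed" result above comes from the rescue-block assertion the log shows being evaluated, `ansible_failed_result.msg != 'UNREACH'`. A minimal Python sketch follows; the dict is a hypothetical stand-in for the registered failure result, with only the `msg` string taken from the failure recorded earlier in this log.]

```python
# Hypothetical stand-in for the result the rescue block inspects via
# ansible_failed_result; only "msg" matches the recorded failure.
ansible_failed_result = {
    "changed": False,
    "msg": "Boolean values are not allowed for sysctl settings",
}

# The test only asserts that the task failed with the role's own
# validation error, not because the host was unreachable ('UNREACH').
assertion_passed = ansible_failed_result["msg"] != "UNREACH"
print("All assertions passed" if assertion_passed else "assertion failed")
```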
8231 1726776614.49978: running TaskExecutor() for managed_node1/TASK: Cleanup 8231 1726776614.50081: in run() - task 120fa90a-8a95-f1be-6eb1-000000000009 8231 1726776614.50097: variable 'ansible_search_path' from source: unknown 8231 1726776614.50130: calling self._execute() 8231 1726776614.50195: variable 'ansible_host' from source: host vars for 'managed_node1' 8231 1726776614.50205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8231 1726776614.50214: variable 'omit' from source: magic vars 8231 1726776614.50308: _execute() done 8231 1726776614.50315: dumping result to json 8231 1726776614.50319: done dumping result, returning 8231 1726776614.50324: done running TaskExecutor() for managed_node1/TASK: Cleanup [120fa90a-8a95-f1be-6eb1-000000000009] 8231 1726776614.50333: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000009 8231 1726776614.50364: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000009 8231 1726776614.50368: WORKER PROCESS EXITING 8186 1726776614.50731: no more pending results, returning what we have 8186 1726776614.50735: in VariableManager get_vars() 8186 1726776614.50769: Calling all_inventory to load vars for managed_node1 8186 1726776614.50771: Calling groups_inventory to load vars for managed_node1 8186 1726776614.50774: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.50785: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.50787: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.50790: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.50985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.51169: done with get_vars() 8186 1726776614.51176: variable 'ansible_search_path' from source: unknown 8186 1726776614.51189: we have included files to process 8186 1726776614.51190: generating all_blocks data 8186 
1726776614.51192: done generating all_blocks data 8186 1726776614.51198: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8186 1726776614.51199: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8186 1726776614.51201: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml for managed_node1 8186 1726776614.52243: done processing included file 8186 1726776614.52245: iterating over new_blocks loaded from include file 8186 1726776614.52246: in VariableManager get_vars() 8186 1726776614.52262: done with get_vars() 8186 1726776614.52264: filtering new block on tags 8186 1726776614.52277: done filtering new block on tags 8186 1726776614.52279: in VariableManager get_vars() 8186 1726776614.52290: done with get_vars() 8186 1726776614.52292: filtering new block on tags 8186 1726776614.52312: done filtering new block on tags 8186 1726776614.52314: done iterating over new_blocks loaded from include file 8186 1726776614.52314: extending task lists for all hosts with included blocks 8186 1726776614.53448: done extending task lists 8186 1726776614.53449: done processing included files 8186 1726776614.53450: results queue empty 8186 1726776614.53450: checking for any_errors_fatal 8186 1726776614.53453: done checking for any_errors_fatal 8186 1726776614.53456: checking for max_fail_percentage 8186 1726776614.53457: done checking for max_fail_percentage 8186 1726776614.53458: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.53459: done checking to see if all hosts have failed 8186 1726776614.53459: getting the remaining hosts for this loop 8186 1726776614.53460: done getting the remaining hosts for 
this loop
8186 1726776614.53463: getting the next task for host managed_node1
8186 1726776614.53467: done getting next task for host managed_node1
8186 1726776614.53469: ^ task is: TASK: Show current tuned profile settings
8186 1726776614.53471: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False
8186 1726776614.53473: getting variables
8186 1726776614.53474: in VariableManager get_vars()
8186 1726776614.53485: Calling all_inventory to load vars for managed_node1
8186 1726776614.53487: Calling groups_inventory to load vars for managed_node1
8186 1726776614.53489: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776614.53494: Calling all_plugins_play to load vars for managed_node1
8186 1726776614.53496: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776614.53499: Calling groups_plugins_play to load vars for managed_node1
8186 1726776614.53644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776614.53815: done with get_vars()
8186 1726776614.53823: done getting variables
8186 1726776614.53891: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Show current tuned profile settings] *************************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:2
Thursday 19 September 2024 16:10:14 -0400 (0:00:00.044) 0:00:02.410 ****
8186 1726776614.53916: entering _queue_task() for managed_node1/command
8186 1726776614.53917: Creating lock for command
8186 1726776614.54218: worker is 1 (out of 1 available)
8186 1726776614.54230: exiting _queue_task() for managed_node1/command
8186 1726776614.54241: done queuing things up, now waiting for results queue to drain
8186 1726776614.54243: waiting for pending results...
8232 1726776614.54495: running TaskExecutor() for managed_node1/TASK: Show current tuned profile settings
8232 1726776614.54613: in run() - task 120fa90a-8a95-f1be-6eb1-000000000095
8232 1726776614.54632: variable 'ansible_search_path' from source: unknown
8232 1726776614.54638: variable 'ansible_search_path' from source: unknown
8232 1726776614.54669: calling self._execute()
8232 1726776614.54734: variable 'ansible_host' from source: host vars for 'managed_node1'
8232 1726776614.54745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8232 1726776614.54754: variable 'omit' from source: magic vars
8232 1726776614.54851: variable 'omit' from source: magic vars
8232 1726776614.54890: variable 'omit' from source: magic vars
8232 1726776614.55192: variable '__kernel_settings_profile_filename' from source: role '' exported vars
8232 1726776614.55270: variable '__kernel_settings_profile_dir' from source: role '' exported vars
8232 1726776614.55475: variable '__kernel_settings_tuned_profile' from source: role '' exported vars
8232 1726776614.55563: done running TaskExecutor() for managed_node1/TASK: Show current tuned profile settings [120fa90a-8a95-f1be-6eb1-000000000095]
8232 1726776614.55574: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000095
8232 1726776614.55602: done sending task result for task
120fa90a-8a95-f1be-6eb1-000000000095
8232 1726776614.55605: WORKER PROCESS EXITING
fatal: [managed_node1]: FAILED! => {}

MSG:

The task includes an option with an undefined variable. The error was: {{ __kernel_settings_profile_dir }}/tuned.conf: {{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}: '__kernel_settings_profile_parent' is undefined. '__kernel_settings_profile_parent' is undefined. {{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}: '__kernel_settings_profile_parent' is undefined. '__kernel_settings_profile_parent' is undefined. {{ __kernel_settings_profile_dir }}/tuned.conf: {{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}: '__kernel_settings_profile_parent' is undefined. '__kernel_settings_profile_parent' is undefined. {{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}: '__kernel_settings_profile_parent' is undefined. '__kernel_settings_profile_parent' is undefined

The error appears to be in '/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml': line 2, column 3, but may be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:

---
- name: Show current tuned profile settings
  ^ here

...ignoring
8186 1726776614.55989: no more pending results, returning what we have
8186 1726776614.55992: results queue empty
8186 1726776614.55992: checking for any_errors_fatal
8186 1726776614.55997: done checking for any_errors_fatal
8186 1726776614.55997: checking for max_fail_percentage
8186 1726776614.55999: done checking for max_fail_percentage
8186 1726776614.55999: checking to see if all hosts have failed and the running result is not ok
8186 1726776614.56000: done checking to see if all hosts have failed
8186 1726776614.56001: getting the remaining hosts for this loop
8186 1726776614.56002: done getting the remaining hosts for this loop
8186 1726776614.56005: getting the next task for host managed_node1
8186 1726776614.56011: done getting next task for host managed_node1
8186 1726776614.56014: ^ task is: TASK: Run role with purge to remove everything
8186 1726776614.56017: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task?
False
8186 1726776614.56021: getting variables
8186 1726776614.56022: in VariableManager get_vars()
8186 1726776614.56056: Calling all_inventory to load vars for managed_node1
8186 1726776614.56059: Calling groups_inventory to load vars for managed_node1
8186 1726776614.56062: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776614.56070: Calling all_plugins_play to load vars for managed_node1
8186 1726776614.56073: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776614.56076: Calling groups_plugins_play to load vars for managed_node1
8186 1726776614.56236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776614.56428: done with get_vars()
8186 1726776614.56440: done getting variables

TASK [Run role with purge to remove everything] ********************************
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:9
Thursday 19 September 2024 16:10:14 -0400 (0:00:00.026) 0:00:02.437 ****
8186 1726776614.56524: entering _queue_task() for managed_node1/include_role
8186 1726776614.56718: worker is 1 (out of 1 available)
8186 1726776614.56731: exiting _queue_task() for managed_node1/include_role
8186 1726776614.56742: done queuing things up, now waiting for results queue to drain
8186 1726776614.56744: waiting for pending results...
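The `fatal: ... '__kernel_settings_profile_parent' is undefined` failure above is a Jinja2 strict-undefined error surfacing through a chain of templated defaults (`__kernel_settings_profile_dir` expands to `{{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}`, so the one missing leaf variable is reported at every level of the chain), and the `...ignoring` marker shows the task tolerates it via `ignore_errors`. The templating behavior can be reproduced with Jinja2 directly; this is a sketch, not the role's actual variable files, and the sample values are hypothetical:

```python
from jinja2 import Environment, StrictUndefined
from jinja2.exceptions import UndefinedError

# Ansible renders variables with strict undefined handling, so any
# reference to a missing variable raises instead of expanding to "".
env = Environment(undefined=StrictUndefined)

# Hypothetical stand-in for the role's chained defaults; only the leaf
# variable '__kernel_settings_profile_parent' is left undefined.
available = {"__kernel_settings_tuned_profile": "kernel_settings"}
template = env.from_string(
    "{{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}/tuned.conf"
)

try:
    template.render(**available)
except UndefinedError as err:
    print(err)  # '__kernel_settings_profile_parent' is undefined
```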
8233 1726776614.56924: running TaskExecutor() for managed_node1/TASK: Run role with purge to remove everything 8233 1726776614.57042: in run() - task 120fa90a-8a95-f1be-6eb1-000000000097 8233 1726776614.57060: variable 'ansible_search_path' from source: unknown 8233 1726776614.57065: variable 'ansible_search_path' from source: unknown 8233 1726776614.57095: calling self._execute() 8233 1726776614.57175: variable 'ansible_host' from source: host vars for 'managed_node1' 8233 1726776614.57185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8233 1726776614.57194: variable 'omit' from source: magic vars 8233 1726776614.57296: _execute() done 8233 1726776614.57302: dumping result to json 8233 1726776614.57308: done dumping result, returning 8233 1726776614.57314: done running TaskExecutor() for managed_node1/TASK: Run role with purge to remove everything [120fa90a-8a95-f1be-6eb1-000000000097] 8233 1726776614.57323: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000097 8233 1726776614.57359: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000097 8233 1726776614.57363: WORKER PROCESS EXITING 8186 1726776614.57661: no more pending results, returning what we have 8186 1726776614.57665: in VariableManager get_vars() 8186 1726776614.57697: Calling all_inventory to load vars for managed_node1 8186 1726776614.57700: Calling groups_inventory to load vars for managed_node1 8186 1726776614.57702: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.57712: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.57715: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.57718: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.58078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.58266: done with get_vars() 8186 1726776614.58275: variable 'ansible_search_path' 
from source: unknown 8186 1726776614.58276: variable 'ansible_search_path' from source: unknown 8186 1726776614.58593: variable 'omit' from source: magic vars 8186 1726776614.58624: variable 'omit' from source: magic vars 8186 1726776614.58640: variable 'omit' from source: magic vars 8186 1726776614.58643: we have included files to process 8186 1726776614.58644: generating all_blocks data 8186 1726776614.58645: done generating all_blocks data 8186 1726776614.58647: processing included file: fedora.linux_system_roles.kernel_settings 8186 1726776614.58671: in VariableManager get_vars() 8186 1726776614.58684: done with get_vars() 8186 1726776614.58709: in VariableManager get_vars() 8186 1726776614.58725: done with get_vars() 8186 1726776614.58768: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8186 1726776614.58827: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8186 1726776614.58858: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8186 1726776614.58935: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8186 1726776614.59465: in VariableManager get_vars() 8186 1726776614.59484: done with get_vars() 8186 1726776614.60737: in VariableManager get_vars() 8186 1726776614.60760: done with get_vars() 8186 1726776614.60919: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8186 1726776614.61586: iterating over new_blocks loaded from include file 8186 1726776614.61588: in VariableManager get_vars() 8186 1726776614.61603: done with get_vars() 8186 1726776614.61605: filtering new block on tags 8186 1726776614.61623: done filtering new block on tags 8186 1726776614.61625: in VariableManager get_vars() 8186 
1726776614.61670: done with get_vars() 8186 1726776614.61672: filtering new block on tags 8186 1726776614.61692: done filtering new block on tags 8186 1726776614.61694: in VariableManager get_vars() 8186 1726776614.61715: done with get_vars() 8186 1726776614.61716: filtering new block on tags 8186 1726776614.61775: done filtering new block on tags 8186 1726776614.61778: in VariableManager get_vars() 8186 1726776614.61794: done with get_vars() 8186 1726776614.61795: filtering new block on tags 8186 1726776614.61811: done filtering new block on tags 8186 1726776614.61812: done iterating over new_blocks loaded from include file 8186 1726776614.61813: extending task lists for all hosts with included blocks 8186 1726776614.62108: done extending task lists 8186 1726776614.62109: done processing included files 8186 1726776614.62110: results queue empty 8186 1726776614.62111: checking for any_errors_fatal 8186 1726776614.62113: done checking for any_errors_fatal 8186 1726776614.62114: checking for max_fail_percentage 8186 1726776614.62115: done checking for max_fail_percentage 8186 1726776614.62115: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.62116: done checking to see if all hosts have failed 8186 1726776614.62116: getting the remaining hosts for this loop 8186 1726776614.62117: done getting the remaining hosts for this loop 8186 1726776614.62120: getting the next task for host managed_node1 8186 1726776614.62124: done getting next task for host managed_node1 8186 1726776614.62126: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8186 1726776614.62131: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False
8186 1726776614.62141: getting variables
8186 1726776614.62142: in VariableManager get_vars()
8186 1726776614.62154: Calling all_inventory to load vars for managed_node1
8186 1726776614.62158: Calling groups_inventory to load vars for managed_node1
8186 1726776614.62160: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776614.62164: Calling all_plugins_play to load vars for managed_node1
8186 1726776614.62167: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776614.62169: Calling groups_plugins_play to load vars for managed_node1
8186 1726776614.62302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776614.62501: done with get_vars()
8186 1726776614.62509: done getting variables
8186 1726776614.62544: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2
Thursday 19 September 2024 16:10:14 -0400
(0:00:00.060) 0:00:02.497 **** 8186 1726776614.62579: entering _queue_task() for managed_node1/fail 8186 1726776614.62793: worker is 1 (out of 1 available) 8186 1726776614.62803: exiting _queue_task() for managed_node1/fail 8186 1726776614.62814: done queuing things up, now waiting for results queue to drain 8186 1726776614.62817: waiting for pending results... 8234 1726776614.63038: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8234 1726776614.63178: in run() - task 120fa90a-8a95-f1be-6eb1-00000000013d 8234 1726776614.63196: variable 'ansible_search_path' from source: unknown 8234 1726776614.63201: variable 'ansible_search_path' from source: unknown 8234 1726776614.63232: calling self._execute() 8234 1726776614.63297: variable 'ansible_host' from source: host vars for 'managed_node1' 8234 1726776614.63359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8234 1726776614.63369: variable 'omit' from source: magic vars 8234 1726776614.63790: variable 'kernel_settings_sysctl' from source: include params 8234 1726776614.63801: variable '__kernel_settings_state_empty' from source: role '' all vars 8234 1726776614.63812: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 8234 1726776614.64182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8234 1726776614.66394: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8234 1726776614.66479: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8234 1726776614.66514: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8234 1726776614.66550: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8234 
1726776614.66578: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8234 1726776614.66643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8234 1726776614.66672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8234 1726776614.66696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8234 1726776614.66737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8234 1726776614.66752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8234 1726776614.66806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8234 1726776614.66832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8234 1726776614.66859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8234 1726776614.66898: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8234 1726776614.66913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8234 1726776614.66957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8234 1726776614.66981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8234 1726776614.67005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8234 1726776614.67045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8234 1726776614.67063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8234 1726776614.67400: variable 'kernel_settings_sysctl' from source: include params 8234 1726776614.67423: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 8234 1726776614.67428: when evaluation is False, skipping this task 8234 1726776614.67433: _execute() 
done
8234 1726776614.67436: dumping result to json
8234 1726776614.67439: done dumping result, returning
8234 1726776614.67444: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [120fa90a-8a95-f1be-6eb1-00000000013d]
8234 1726776614.67450: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000013d
8234 1726776614.67475: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000013d
8234 1726776614.67478: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)",
    "skip_reason": "Conditional result was False"
}
8186 1726776614.67838: no more pending results, returning what we have
8186 1726776614.67841: results queue empty
8186 1726776614.67841: checking for any_errors_fatal
8186 1726776614.67843: done checking for any_errors_fatal
8186 1726776614.67844: checking for max_fail_percentage
8186 1726776614.67845: done checking for max_fail_percentage
8186 1726776614.67846: checking to see if all hosts have failed and the running result is not ok
8186 1726776614.67846: done checking to see if all hosts have failed
8186 1726776614.67847: getting the remaining hosts for this loop
8186 1726776614.67848: done getting the remaining hosts for this loop
8186 1726776614.67851: getting the next task for host managed_node1
8186 1726776614.67859: done getting next task for host managed_node1
8186 1726776614.67863: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables
8186 1726776614.67866: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks
child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False
8186 1726776614.67883: getting variables
8186 1726776614.67884: in VariableManager get_vars()
8186 1726776614.67957: Calling all_inventory to load vars for managed_node1
8186 1726776614.67960: Calling groups_inventory to load vars for managed_node1
8186 1726776614.67962: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776614.67970: Calling all_plugins_play to load vars for managed_node1
8186 1726776614.67972: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776614.67975: Calling groups_plugins_play to load vars for managed_node1
8186 1726776614.68127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776614.68320: done with get_vars()
8186 1726776614.68331: done getting variables

TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9
Thursday 19 September 2024 16:10:14 -0400 (0:00:00.058) 0:00:02.555 ****
8186 1726776614.68414: entering _queue_task() for managed_node1/include_tasks
8186 1726776614.68600: worker is 1 (out of 1 available)
8186 1726776614.68613: exiting _queue_task() for managed_node1/include_tasks
8186 1726776614.68624: done
queuing things up, now waiting for results queue to drain 8186 1726776614.68626: waiting for pending results... 8235 1726776614.68909: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8235 1726776614.69036: in run() - task 120fa90a-8a95-f1be-6eb1-00000000013e 8235 1726776614.69052: variable 'ansible_search_path' from source: unknown 8235 1726776614.69059: variable 'ansible_search_path' from source: unknown 8235 1726776614.69088: calling self._execute() 8235 1726776614.69152: variable 'ansible_host' from source: host vars for 'managed_node1' 8235 1726776614.69164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8235 1726776614.69172: variable 'omit' from source: magic vars 8235 1726776614.69263: _execute() done 8235 1726776614.69269: dumping result to json 8235 1726776614.69272: done dumping result, returning 8235 1726776614.69277: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [120fa90a-8a95-f1be-6eb1-00000000013e] 8235 1726776614.69285: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000013e 8235 1726776614.69309: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000013e 8235 1726776614.69313: WORKER PROCESS EXITING 8186 1726776614.69594: no more pending results, returning what we have 8186 1726776614.69598: in VariableManager get_vars() 8186 1726776614.69631: Calling all_inventory to load vars for managed_node1 8186 1726776614.69633: Calling groups_inventory to load vars for managed_node1 8186 1726776614.69635: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.69644: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.69646: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.69649: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.69805: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.69993: done with get_vars() 8186 1726776614.70000: variable 'ansible_search_path' from source: unknown 8186 1726776614.70001: variable 'ansible_search_path' from source: unknown 8186 1726776614.70034: we have included files to process 8186 1726776614.70035: generating all_blocks data 8186 1726776614.70036: done generating all_blocks data 8186 1726776614.70040: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8186 1726776614.70042: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8186 1726776614.70043: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node1 8186 1726776614.70741: done processing included file 8186 1726776614.70744: iterating over new_blocks loaded from include file 8186 1726776614.70745: in VariableManager get_vars() 8186 1726776614.70769: done with get_vars() 8186 1726776614.70770: filtering new block on tags 8186 1726776614.70786: done filtering new block on tags 8186 1726776614.70788: in VariableManager get_vars() 8186 1726776614.70809: done with get_vars() 8186 1726776614.70810: filtering new block on tags 8186 1726776614.70830: done filtering new block on tags 8186 1726776614.70832: in VariableManager get_vars() 8186 1726776614.70852: done with get_vars() 8186 1726776614.70853: filtering new block on tags 8186 1726776614.70873: done filtering new block on tags 8186 1726776614.70875: in VariableManager get_vars() 8186 1726776614.70895: done with get_vars() 8186 1726776614.70896: filtering new block on tags 8186 1726776614.70908: done filtering new block on 
tags 8186 1726776614.70911: done iterating over new_blocks loaded from include file 8186 1726776614.70912: extending task lists for all hosts with included blocks 8186 1726776614.71127: done extending task lists 8186 1726776614.71128: done processing included files 8186 1726776614.71130: results queue empty 8186 1726776614.71130: checking for any_errors_fatal 8186 1726776614.71133: done checking for any_errors_fatal 8186 1726776614.71133: checking for max_fail_percentage 8186 1726776614.71134: done checking for max_fail_percentage 8186 1726776614.71134: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.71135: done checking to see if all hosts have failed 8186 1726776614.71135: getting the remaining hosts for this loop 8186 1726776614.71135: done getting the remaining hosts for this loop 8186 1726776614.71137: getting the next task for host managed_node1 8186 1726776614.71141: done getting next task for host managed_node1 8186 1726776614.71144: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8186 1726776614.71147: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776614.71153: getting variables 8186 1726776614.71154: in VariableManager get_vars() 8186 1726776614.71164: Calling all_inventory to load vars for managed_node1 8186 1726776614.71166: Calling groups_inventory to load vars for managed_node1 8186 1726776614.71167: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.71170: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.71171: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.71172: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.71266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.71381: done with get_vars() 8186 1726776614.71387: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 16:10:14 -0400 (0:00:00.030) 0:00:02.586 **** 8186 1726776614.71434: entering _queue_task() for managed_node1/setup 8186 1726776614.71592: worker is 1 (out of 1 available) 8186 1726776614.71606: exiting _queue_task() for managed_node1/setup 8186 1726776614.71616: done queuing things up, now waiting for results queue to drain 8186 1726776614.71619: waiting for pending results... 
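For orientation: the worker just queued is about to evaluate the role's fact check. From what this log itself shows (the queued action is `setup`, and the evaluated conditional is `__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0`), the task at `set_vars.yml:2` plausibly has the shape sketched below. This is an inference from the log, not the role's verbatim source; in particular the `gather_subset` value is an assumption.

```yaml
# Hypothetical reconstruction of set_vars.yml:2, inferred from this log.
# gather_subset is an assumption, not taken from the log.
- name: Ensure ansible_facts used by role
  setup:
    gather_subset: min
  when: __kernel_settings_required_facts |
        difference(ansible_facts.keys() | list) | length > 0
```

When every required fact is already present, the condition evaluates to False and the task is skipped, which is exactly the `skipping: [managed_node1]` result recorded below.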
8237 1726776614.71712: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8237 1726776614.71827: in run() - task 120fa90a-8a95-f1be-6eb1-0000000001b9 8237 1726776614.71844: variable 'ansible_search_path' from source: unknown 8237 1726776614.71849: variable 'ansible_search_path' from source: unknown 8237 1726776614.71875: calling self._execute() 8237 1726776614.71925: variable 'ansible_host' from source: host vars for 'managed_node1' 8237 1726776614.71935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8237 1726776614.71943: variable 'omit' from source: magic vars 8237 1726776614.72286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8237 1726776614.73920: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8237 1726776614.73969: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8237 1726776614.73996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8237 1726776614.74023: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8237 1726776614.74060: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8237 1726776614.74114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8237 1726776614.74137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8237 1726776614.74158: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8237 1726776614.74187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8237 1726776614.74199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8237 1726776614.74238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8237 1726776614.74257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8237 1726776614.74275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8237 1726776614.74303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8237 1726776614.74314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8237 1726776614.74434: variable '__kernel_settings_required_facts' from source: role '' all vars 8237 1726776614.74447: variable 'ansible_facts' from source: unknown 8237 1726776614.74534: Evaluated conditional (__kernel_settings_required_facts | 
difference(ansible_facts.keys() | list) | length > 0): False 8237 1726776614.74540: when evaluation is False, skipping this task 8237 1726776614.74544: _execute() done 8237 1726776614.74548: dumping result to json 8237 1726776614.74551: done dumping result, returning 8237 1726776614.74560: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [120fa90a-8a95-f1be-6eb1-0000000001b9] 8237 1726776614.74567: sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001b9 8237 1726776614.74590: done sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001b9 8237 1726776614.74593: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8186 1726776614.74973: no more pending results, returning what we have 8186 1726776614.74976: results queue empty 8186 1726776614.74976: checking for any_errors_fatal 8186 1726776614.74977: done checking for any_errors_fatal 8186 1726776614.74978: checking for max_fail_percentage 8186 1726776614.74979: done checking for max_fail_percentage 8186 1726776614.74979: checking to see if all hosts have failed and the running result is not ok 8186 1726776614.74980: done checking to see if all hosts have failed 8186 1726776614.74980: getting the remaining hosts for this loop 8186 1726776614.74981: done getting the remaining hosts for this loop 8186 1726776614.74983: getting the next task for host managed_node1 8186 1726776614.74989: done getting next task for host managed_node1 8186 1726776614.74991: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8186 1726776614.74994: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776614.75008: getting variables 8186 1726776614.75009: in VariableManager get_vars() 8186 1726776614.75036: Calling all_inventory to load vars for managed_node1 8186 1726776614.75038: Calling groups_inventory to load vars for managed_node1 8186 1726776614.75040: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776614.75049: Calling all_plugins_play to load vars for managed_node1 8186 1726776614.75051: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776614.75053: Calling groups_plugins_play to load vars for managed_node1 8186 1726776614.75155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776614.75293: done with get_vars() 8186 1726776614.75301: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 16:10:14 -0400 
(0:00:00.039) 0:00:02.625 **** 8186 1726776614.75374: entering _queue_task() for managed_node1/stat 8186 1726776614.75532: worker is 1 (out of 1 available) 8186 1726776614.75546: exiting _queue_task() for managed_node1/stat 8186 1726776614.75556: done queuing things up, now waiting for results queue to drain 8186 1726776614.75558: waiting for pending results... 8239 1726776614.75664: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8239 1726776614.75779: in run() - task 120fa90a-8a95-f1be-6eb1-0000000001bb 8239 1726776614.75795: variable 'ansible_search_path' from source: unknown 8239 1726776614.75799: variable 'ansible_search_path' from source: unknown 8239 1726776614.75825: calling self._execute() 8239 1726776614.75878: variable 'ansible_host' from source: host vars for 'managed_node1' 8239 1726776614.75934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8239 1726776614.75943: variable 'omit' from source: magic vars 8239 1726776614.76272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8239 1726776614.76441: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8239 1726776614.76474: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8239 1726776614.76499: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8239 1726776614.76527: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8239 1726776614.76589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8239 1726776614.76609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8239 1726776614.76630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8239 1726776614.76650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8239 1726776614.76736: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8239 1726776614.76744: variable 'omit' from source: magic vars 8239 1726776614.76789: variable 'omit' from source: magic vars 8239 1726776614.76810: variable 'omit' from source: magic vars 8239 1726776614.76832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8239 1726776614.76852: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8239 1726776614.76871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8239 1726776614.76885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8239 1726776614.76895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8239 1726776614.76917: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8239 1726776614.76922: variable 'ansible_host' from source: host vars for 'managed_node1' 8239 1726776614.76927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8239 1726776614.76997: Set connection var ansible_shell_executable to /bin/sh 8239 1726776614.77005: Set connection var ansible_timeout to 10 8239 1726776614.77012: Set connection var 
ansible_module_compression to ZIP_DEFLATED 8239 1726776614.77015: Set connection var ansible_connection to ssh 8239 1726776614.77023: Set connection var ansible_pipelining to False 8239 1726776614.77029: Set connection var ansible_shell_type to sh 8239 1726776614.77045: variable 'ansible_shell_executable' from source: unknown 8239 1726776614.77049: variable 'ansible_connection' from source: unknown 8239 1726776614.77052: variable 'ansible_module_compression' from source: unknown 8239 1726776614.77058: variable 'ansible_shell_type' from source: unknown 8239 1726776614.77061: variable 'ansible_shell_executable' from source: unknown 8239 1726776614.77064: variable 'ansible_host' from source: host vars for 'managed_node1' 8239 1726776614.77069: variable 'ansible_pipelining' from source: unknown 8239 1726776614.77072: variable 'ansible_timeout' from source: unknown 8239 1726776614.77076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8239 1726776614.77177: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8239 1726776614.77189: variable 'omit' from source: magic vars 8239 1726776614.77194: starting attempt loop 8239 1726776614.77197: running the handler 8239 1726776614.77209: _low_level_execute_command(): starting 8239 1726776614.77217: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8239 1726776614.79541: stdout chunk (state=2): >>>/root <<< 8239 1726776614.79662: stderr chunk (state=3): >>><<< 8239 1726776614.79669: stdout chunk (state=3): >>><<< 8239 1726776614.79686: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8239 1726776614.79698: _low_level_execute_command(): starting 8239 1726776614.79703: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861 `" && echo ansible-tmp-1726776614.7969341-8239-22518546244861="` echo /root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861 `" ) && sleep 0' 8239 1726776614.82150: stdout chunk (state=2): >>>ansible-tmp-1726776614.7969341-8239-22518546244861=/root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861 <<< 8239 1726776614.82274: stderr chunk (state=3): >>><<< 8239 1726776614.82282: stdout chunk (state=3): >>><<< 8239 1726776614.82297: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776614.7969341-8239-22518546244861=/root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861 , stderr= 8239 1726776614.82336: variable 'ansible_module_compression' from source: unknown 8239 1726776614.82379: ANSIBALLZ: Using lock for stat 8239 1726776614.82383: ANSIBALLZ: Acquiring lock 8239 1726776614.82387: ANSIBALLZ: Lock acquired: 140184657596864 8239 1726776614.82391: ANSIBALLZ: Creating module 8239 1726776614.92586: ANSIBALLZ: Writing module into payload 8239 1726776614.92673: ANSIBALLZ: Writing module 8239 1726776614.92695: ANSIBALLZ: Renaming module 8239 1726776614.92702: ANSIBALLZ: Done creating module 8239 1726776614.92719: variable 'ansible_facts' from source: unknown 8239 1726776614.92776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861/AnsiballZ_stat.py 8239 1726776614.92879: Sending initial data 8239 1726776614.92886: Sent initial data (150 bytes) 8239 1726776614.95529: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmps6rzjgq1 /root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861/AnsiballZ_stat.py <<< 8239 1726776614.96590: stderr chunk (state=3): >>><<< 8239 1726776614.96600: stdout chunk (state=3): >>><<< 8239 1726776614.96620: done transferring module to remote 8239 1726776614.96632: 
_low_level_execute_command(): starting 8239 1726776614.96637: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861/ /root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861/AnsiballZ_stat.py && sleep 0' 8239 1726776614.99034: stderr chunk (state=2): >>><<< 8239 1726776614.99045: stdout chunk (state=2): >>><<< 8239 1726776614.99062: _low_level_execute_command() done: rc=0, stdout=, stderr= 8239 1726776614.99068: _low_level_execute_command(): starting 8239 1726776614.99073: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861/AnsiballZ_stat.py && sleep 0' 8239 1726776615.14145: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8239 1726776615.15252: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8239 1726776615.15301: stderr chunk (state=3): >>><<< 8239 1726776615.15309: stdout chunk (state=3): >>><<< 8239 1726776615.15326: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.14.221 closed. 
8239 1726776615.15351: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8239 1726776615.15363: _low_level_execute_command(): starting 8239 1726776615.15369: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776614.7969341-8239-22518546244861/ > /dev/null 2>&1 && sleep 0' 8239 1726776615.17857: stderr chunk (state=2): >>><<< 8239 1726776615.17866: stdout chunk (state=2): >>><<< 8239 1726776615.17879: _low_level_execute_command() done: rc=0, stdout=, stderr= 8239 1726776615.17886: handler run complete 8239 1726776615.17901: attempt loop complete, returning result 8239 1726776615.17905: _execute() done 8239 1726776615.17908: dumping result to json 8239 1726776615.17912: done dumping result, returning 8239 1726776615.17919: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [120fa90a-8a95-f1be-6eb1-0000000001bb] 8239 1726776615.17925: sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001bb 8239 1726776615.17955: done sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001bb 8239 1726776615.17960: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 8186 1726776615.18153: no more pending results, returning what we have 8186 1726776615.18159: results queue empty 8186 1726776615.18159: checking for 
any_errors_fatal 8186 1726776615.18163: done checking for any_errors_fatal 8186 1726776615.18164: checking for max_fail_percentage 8186 1726776615.18165: done checking for max_fail_percentage 8186 1726776615.18166: checking to see if all hosts have failed and the running result is not ok 8186 1726776615.18166: done checking to see if all hosts have failed 8186 1726776615.18167: getting the remaining hosts for this loop 8186 1726776615.18168: done getting the remaining hosts for this loop 8186 1726776615.18171: getting the next task for host managed_node1 8186 1726776615.18176: done getting next task for host managed_node1 8186 1726776615.18179: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8186 1726776615.18183: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776615.18191: getting variables 8186 1726776615.18193: in VariableManager get_vars() 8186 1726776615.18217: Calling all_inventory to load vars for managed_node1 8186 1726776615.18219: Calling groups_inventory to load vars for managed_node1 8186 1726776615.18221: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776615.18231: Calling all_plugins_play to load vars for managed_node1 8186 1726776615.18233: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776615.18235: Calling groups_plugins_play to load vars for managed_node1 8186 1726776615.18335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776615.18451: done with get_vars() 8186 1726776615.18461: done getting variables 8186 1726776615.18527: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 16:10:15 -0400 (0:00:00.431) 0:00:03.057 **** 8186 1726776615.18553: entering _queue_task() for managed_node1/set_fact 8186 1726776615.18556: Creating lock for set_fact 8186 1726776615.18724: worker is 1 (out of 1 available) 8186 1726776615.18740: exiting _queue_task() for managed_node1/set_fact 8186 1726776615.18752: done queuing things up, now waiting for results queue to drain 8186 1726776615.18757: waiting for pending results... 
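The ostree check that just completed, together with the `set_fact` task now being queued, can be reconstructed from details the log prints directly: the module arguments (`"path": "/run/ostree-booted"`), the registered variable (`__ostree_booted_stat`), and the shared conditional (`not __kernel_settings_is_ostree is defined`). A sketch under those assumptions, not the role's verbatim source:

```yaml
# Hypothetical reconstruction of set_vars.yml:10 and :15, inferred from this log.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __kernel_settings_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined
```

Since `/run/ostree-booted` does not exist on the target (`"stat": {"exists": false}` above), the fact comes out as `__kernel_settings_is_ostree: false`, matching the `ok: [managed_node1]` result that follows.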
8252 1726776615.18858: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8252 1726776615.18977: in run() - task 120fa90a-8a95-f1be-6eb1-0000000001bc 8252 1726776615.18994: variable 'ansible_search_path' from source: unknown 8252 1726776615.18997: variable 'ansible_search_path' from source: unknown 8252 1726776615.19026: calling self._execute() 8252 1726776615.19081: variable 'ansible_host' from source: host vars for 'managed_node1' 8252 1726776615.19090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8252 1726776615.19099: variable 'omit' from source: magic vars 8252 1726776615.19411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8252 1726776615.19605: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8252 1726776615.19640: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8252 1726776615.19667: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8252 1726776615.19694: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8252 1726776615.19756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8252 1726776615.19776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8252 1726776615.19794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8252 1726776615.19812: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8252 1726776615.19901: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8252 1726776615.19909: variable 'omit' from source: magic vars 8252 1726776615.19952: variable 'omit' from source: magic vars 8252 1726776615.20032: variable '__ostree_booted_stat' from source: set_fact 8252 1726776615.20069: variable 'omit' from source: magic vars 8252 1726776615.20090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8252 1726776615.20110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8252 1726776615.20134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8252 1726776615.20148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8252 1726776615.20158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8252 1726776615.20181: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8252 1726776615.20186: variable 'ansible_host' from source: host vars for 'managed_node1' 8252 1726776615.20190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8252 1726776615.20254: Set connection var ansible_shell_executable to /bin/sh 8252 1726776615.20262: Set connection var ansible_timeout to 10 8252 1726776615.20268: Set connection var ansible_module_compression to ZIP_DEFLATED 8252 1726776615.20272: Set connection var ansible_connection to ssh 8252 1726776615.20278: Set connection var ansible_pipelining to False 8252 1726776615.20283: Set connection var ansible_shell_type to sh 8252 1726776615.20298: variable 'ansible_shell_executable' from source: 
unknown 8252 1726776615.20301: variable 'ansible_connection' from source: unknown 8252 1726776615.20305: variable 'ansible_module_compression' from source: unknown 8252 1726776615.20308: variable 'ansible_shell_type' from source: unknown 8252 1726776615.20311: variable 'ansible_shell_executable' from source: unknown 8252 1726776615.20314: variable 'ansible_host' from source: host vars for 'managed_node1' 8252 1726776615.20318: variable 'ansible_pipelining' from source: unknown 8252 1726776615.20322: variable 'ansible_timeout' from source: unknown 8252 1726776615.20326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8252 1726776615.20385: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8252 1726776615.20396: variable 'omit' from source: magic vars 8252 1726776615.20402: starting attempt loop 8252 1726776615.20405: running the handler 8252 1726776615.20414: handler run complete 8252 1726776615.20423: attempt loop complete, returning result 8252 1726776615.20426: _execute() done 8252 1726776615.20430: dumping result to json 8252 1726776615.20434: done dumping result, returning 8252 1726776615.20441: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [120fa90a-8a95-f1be-6eb1-0000000001bc] 8252 1726776615.20446: sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001bc 8252 1726776615.20466: done sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001bc 8252 1726776615.20469: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_is_ostree": false }, "changed": false } 8186 1726776615.20585: no more pending results, returning what we have 8186 
1726776615.20587: results queue empty 8186 1726776615.20588: checking for any_errors_fatal 8186 1726776615.20592: done checking for any_errors_fatal 8186 1726776615.20592: checking for max_fail_percentage 8186 1726776615.20593: done checking for max_fail_percentage 8186 1726776615.20594: checking to see if all hosts have failed and the running result is not ok 8186 1726776615.20595: done checking to see if all hosts have failed 8186 1726776615.20595: getting the remaining hosts for this loop 8186 1726776615.20597: done getting the remaining hosts for this loop 8186 1726776615.20599: getting the next task for host managed_node1 8186 1726776615.20607: done getting next task for host managed_node1 8186 1726776615.20610: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8186 1726776615.20613: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776615.20627: getting variables 8186 1726776615.20630: in VariableManager get_vars() 8186 1726776615.20661: Calling all_inventory to load vars for managed_node1 8186 1726776615.20663: Calling groups_inventory to load vars for managed_node1 8186 1726776615.20664: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776615.20670: Calling all_plugins_play to load vars for managed_node1 8186 1726776615.20672: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776615.20674: Calling groups_plugins_play to load vars for managed_node1 8186 1726776615.20771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776615.20911: done with get_vars() 8186 1726776615.20917: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 16:10:15 -0400 (0:00:00.024) 0:00:03.081 **** 8186 1726776615.20983: entering _queue_task() for managed_node1/stat 8186 1726776615.21131: worker is 1 (out of 1 available) 8186 1726776615.21144: exiting _queue_task() for managed_node1/stat 8186 1726776615.21157: done queuing things up, now waiting for results queue to drain 8186 1726776615.21160: waiting for pending results... 
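The trace above queues a `stat` task from `tasks/set_vars.yml:22` of the `fedora.linux_system_roles.kernel_settings` role. From the module arguments, conditional, and registered variable that appear later in this log, the task is roughly of the following shape (a hedged reconstruction — the exact YAML in the role file may differ):

```yaml
# Hedged sketch of the task this trace executes; the path, conditional,
# and register name are taken from the log entries, not from the role source.
- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat
  when: not __kernel_settings_is_transactional is defined
```

The `when:` guard matches the `Evaluated conditional (not __kernel_settings_is_transactional is defined): True` entry in the trace, and the registered result is consumed by the following `set_fact` task to set `__kernel_settings_is_transactional`.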
8253 1726776615.21259: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8253 1726776615.21375: in run() - task 120fa90a-8a95-f1be-6eb1-0000000001be 8253 1726776615.21391: variable 'ansible_search_path' from source: unknown 8253 1726776615.21395: variable 'ansible_search_path' from source: unknown 8253 1726776615.21419: calling self._execute() 8253 1726776615.21470: variable 'ansible_host' from source: host vars for 'managed_node1' 8253 1726776615.21479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8253 1726776615.21487: variable 'omit' from source: magic vars 8253 1726776615.21797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8253 1726776615.21970: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8253 1726776615.22004: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8253 1726776615.22034: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8253 1726776615.22061: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8253 1726776615.22119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8253 1726776615.22141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8253 1726776615.22161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8253 1726776615.22179: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8253 1726776615.22268: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 8253 1726776615.22277: variable 'omit' from source: magic vars 8253 1726776615.22322: variable 'omit' from source: magic vars 8253 1726776615.22346: variable 'omit' from source: magic vars 8253 1726776615.22367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8253 1726776615.22387: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8253 1726776615.22403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8253 1726776615.22416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8253 1726776615.22425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8253 1726776615.22450: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8253 1726776615.22455: variable 'ansible_host' from source: host vars for 'managed_node1' 8253 1726776615.22460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8253 1726776615.22524: Set connection var ansible_shell_executable to /bin/sh 8253 1726776615.22535: Set connection var ansible_timeout to 10 8253 1726776615.22542: Set connection var ansible_module_compression to ZIP_DEFLATED 8253 1726776615.22545: Set connection var ansible_connection to ssh 8253 1726776615.22551: Set connection var ansible_pipelining to False 8253 1726776615.22556: Set connection var ansible_shell_type to sh 8253 1726776615.22571: variable 'ansible_shell_executable' from source: unknown 8253 1726776615.22575: variable 'ansible_connection' 
from source: unknown 8253 1726776615.22578: variable 'ansible_module_compression' from source: unknown 8253 1726776615.22581: variable 'ansible_shell_type' from source: unknown 8253 1726776615.22585: variable 'ansible_shell_executable' from source: unknown 8253 1726776615.22588: variable 'ansible_host' from source: host vars for 'managed_node1' 8253 1726776615.22592: variable 'ansible_pipelining' from source: unknown 8253 1726776615.22596: variable 'ansible_timeout' from source: unknown 8253 1726776615.22600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8253 1726776615.22692: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8253 1726776615.22702: variable 'omit' from source: magic vars 8253 1726776615.22707: starting attempt loop 8253 1726776615.22711: running the handler 8253 1726776615.22722: _low_level_execute_command(): starting 8253 1726776615.22732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8253 1726776615.25070: stdout chunk (state=2): >>>/root <<< 8253 1726776615.25193: stderr chunk (state=3): >>><<< 8253 1726776615.25200: stdout chunk (state=3): >>><<< 8253 1726776615.25221: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8253 1726776615.25235: _low_level_execute_command(): starting 8253 1726776615.25241: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369 `" && echo ansible-tmp-1726776615.2522995-8253-212109925346369="` echo /root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369 `" ) && sleep 0' 8253 1726776615.27717: stdout chunk (state=2): 
>>>ansible-tmp-1726776615.2522995-8253-212109925346369=/root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369 <<< 8253 1726776615.27848: stderr chunk (state=3): >>><<< 8253 1726776615.27858: stdout chunk (state=3): >>><<< 8253 1726776615.27874: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776615.2522995-8253-212109925346369=/root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369 , stderr= 8253 1726776615.27911: variable 'ansible_module_compression' from source: unknown 8253 1726776615.27960: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8253 1726776615.27990: variable 'ansible_facts' from source: unknown 8253 1726776615.28061: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369/AnsiballZ_stat.py 8253 1726776615.28164: Sending initial data 8253 1726776615.28171: Sent initial data (151 bytes) 8253 1726776615.30683: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpchi715hm /root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369/AnsiballZ_stat.py <<< 8253 1726776615.31760: stderr chunk (state=3): >>><<< 8253 1726776615.31772: stdout chunk (state=3): >>><<< 8253 1726776615.31793: done transferring module to remote 8253 1726776615.31804: _low_level_execute_command(): starting 8253 1726776615.31810: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369/ /root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369/AnsiballZ_stat.py && sleep 0' 8253 1726776615.34188: stderr chunk (state=2): >>><<< 8253 1726776615.34200: stdout chunk (state=2): >>><<< 8253 1726776615.34216: _low_level_execute_command() done: rc=0, stdout=, stderr= 8253 1726776615.34220: _low_level_execute_command(): starting 8253 1726776615.34228: _low_level_execute_command(): executing: 
/bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369/AnsiballZ_stat.py && sleep 0' 8253 1726776615.49207: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8253 1726776615.50275: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8253 1726776615.50325: stderr chunk (state=3): >>><<< 8253 1726776615.50334: stdout chunk (state=3): >>><<< 8253 1726776615.50351: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.14.221 closed. 8253 1726776615.50402: done with _execute_module (stat, {'path': '/sbin/transactional-update', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8253 1726776615.50416: _low_level_execute_command(): starting 8253 1726776615.50422: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776615.2522995-8253-212109925346369/ > /dev/null 2>&1 && sleep 0' 8253 1726776615.52871: stderr chunk (state=2): >>><<< 8253 1726776615.52880: 
stdout chunk (state=2): >>><<< 8253 1726776615.52897: _low_level_execute_command() done: rc=0, stdout=, stderr= 8253 1726776615.52905: handler run complete 8253 1726776615.52921: attempt loop complete, returning result 8253 1726776615.52925: _execute() done 8253 1726776615.52930: dumping result to json 8253 1726776615.52935: done dumping result, returning 8253 1726776615.52942: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [120fa90a-8a95-f1be-6eb1-0000000001be] 8253 1726776615.52948: sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001be 8253 1726776615.52976: done sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001be 8253 1726776615.52980: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 8186 1726776615.53111: no more pending results, returning what we have 8186 1726776615.53113: results queue empty 8186 1726776615.53114: checking for any_errors_fatal 8186 1726776615.53119: done checking for any_errors_fatal 8186 1726776615.53120: checking for max_fail_percentage 8186 1726776615.53121: done checking for max_fail_percentage 8186 1726776615.53122: checking to see if all hosts have failed and the running result is not ok 8186 1726776615.53123: done checking to see if all hosts have failed 8186 1726776615.53123: getting the remaining hosts for this loop 8186 1726776615.53124: done getting the remaining hosts for this loop 8186 1726776615.53127: getting the next task for host managed_node1 8186 1726776615.53135: done getting next task for host managed_node1 8186 1726776615.53137: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8186 1726776615.53142: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776615.53151: getting variables 8186 1726776615.53153: in VariableManager get_vars() 8186 1726776615.53185: Calling all_inventory to load vars for managed_node1 8186 1726776615.53188: Calling groups_inventory to load vars for managed_node1 8186 1726776615.53190: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776615.53198: Calling all_plugins_play to load vars for managed_node1 8186 1726776615.53201: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776615.53203: Calling groups_plugins_play to load vars for managed_node1 8186 1726776615.53316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776615.53438: done with get_vars() 8186 1726776615.53446: done getting variables 8186 1726776615.53487: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 16:10:15 -0400 (0:00:00.325) 0:00:03.406 **** 8186 1726776615.53510: entering _queue_task() for managed_node1/set_fact 8186 1726776615.53675: worker is 1 (out of 1 available) 8186 1726776615.53690: exiting _queue_task() for managed_node1/set_fact 8186 1726776615.53702: done queuing things up, now waiting for results queue to drain 8186 1726776615.53704: waiting for pending results... 8261 1726776615.53814: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8261 1726776615.53933: in run() - task 120fa90a-8a95-f1be-6eb1-0000000001bf 8261 1726776615.53952: variable 'ansible_search_path' from source: unknown 8261 1726776615.53958: variable 'ansible_search_path' from source: unknown 8261 1726776615.53985: calling self._execute() 8261 1726776615.54040: variable 'ansible_host' from source: host vars for 'managed_node1' 8261 1726776615.54048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8261 1726776615.54059: variable 'omit' from source: magic vars 8261 1726776615.54385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8261 1726776615.54609: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8261 1726776615.54642: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8261 1726776615.54670: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8261 
1726776615.54698: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8261 1726776615.54762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8261 1726776615.54782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8261 1726776615.54802: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8261 1726776615.54821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8261 1726776615.54914: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 8261 1726776615.54922: variable 'omit' from source: magic vars 8261 1726776615.54973: variable 'omit' from source: magic vars 8261 1726776615.55057: variable '__transactional_update_stat' from source: set_fact 8261 1726776615.55093: variable 'omit' from source: magic vars 8261 1726776615.55114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8261 1726776615.55135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8261 1726776615.55152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8261 1726776615.55168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8261 1726776615.55177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 8261 1726776615.55200: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8261 1726776615.55205: variable 'ansible_host' from source: host vars for 'managed_node1' 8261 1726776615.55209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8261 1726776615.55281: Set connection var ansible_shell_executable to /bin/sh 8261 1726776615.55289: Set connection var ansible_timeout to 10 8261 1726776615.55295: Set connection var ansible_module_compression to ZIP_DEFLATED 8261 1726776615.55298: Set connection var ansible_connection to ssh 8261 1726776615.55305: Set connection var ansible_pipelining to False 8261 1726776615.55311: Set connection var ansible_shell_type to sh 8261 1726776615.55326: variable 'ansible_shell_executable' from source: unknown 8261 1726776615.55332: variable 'ansible_connection' from source: unknown 8261 1726776615.55334: variable 'ansible_module_compression' from source: unknown 8261 1726776615.55336: variable 'ansible_shell_type' from source: unknown 8261 1726776615.55338: variable 'ansible_shell_executable' from source: unknown 8261 1726776615.55339: variable 'ansible_host' from source: host vars for 'managed_node1' 8261 1726776615.55342: variable 'ansible_pipelining' from source: unknown 8261 1726776615.55343: variable 'ansible_timeout' from source: unknown 8261 1726776615.55345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8261 1726776615.55404: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8261 1726776615.55415: variable 'omit' from source: magic vars 8261 1726776615.55418: starting attempt loop 8261 1726776615.55420: running the handler 8261 1726776615.55427: handler run 
complete 8261 1726776615.55436: attempt loop complete, returning result 8261 1726776615.55438: _execute() done 8261 1726776615.55440: dumping result to json 8261 1726776615.55443: done dumping result, returning 8261 1726776615.55448: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [120fa90a-8a95-f1be-6eb1-0000000001bf] 8261 1726776615.55452: sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001bf 8261 1726776615.55471: done sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001bf 8261 1726776615.55474: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_is_transactional": false }, "changed": false } 8186 1726776615.55745: no more pending results, returning what we have 8186 1726776615.55747: results queue empty 8186 1726776615.55748: checking for any_errors_fatal 8186 1726776615.55751: done checking for any_errors_fatal 8186 1726776615.55752: checking for max_fail_percentage 8186 1726776615.55753: done checking for max_fail_percentage 8186 1726776615.55753: checking to see if all hosts have failed and the running result is not ok 8186 1726776615.55754: done checking to see if all hosts have failed 8186 1726776615.55754: getting the remaining hosts for this loop 8186 1726776615.55756: done getting the remaining hosts for this loop 8186 1726776615.55758: getting the next task for host managed_node1 8186 1726776615.55764: done getting next task for host managed_node1 8186 1726776615.55766: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8186 1726776615.55769: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776615.55780: getting variables 8186 1726776615.55781: in VariableManager get_vars() 8186 1726776615.55807: Calling all_inventory to load vars for managed_node1 8186 1726776615.55809: Calling groups_inventory to load vars for managed_node1 8186 1726776615.55810: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776615.55817: Calling all_plugins_play to load vars for managed_node1 8186 1726776615.55818: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776615.55820: Calling groups_plugins_play to load vars for managed_node1 8186 1726776615.55954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776615.56074: done with get_vars() 8186 1726776615.56081: done getting variables 8186 1726776615.56166: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 16:10:15 -0400 (0:00:00.026) 0:00:03.433 **** 8186 1726776615.56189: entering _queue_task() for managed_node1/include_vars 8186 1726776615.56190: Creating lock for include_vars 8186 1726776615.56364: worker is 1 (out of 1 available) 8186 1726776615.56379: exiting _queue_task() for managed_node1/include_vars 8186 1726776615.56392: done queuing things up, now waiting for results queue to drain 8186 1726776615.56394: waiting for pending results... 8262 1726776615.56501: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8262 1726776615.56616: in run() - task 120fa90a-8a95-f1be-6eb1-0000000001c1 8262 1726776615.56635: variable 'ansible_search_path' from source: unknown 8262 1726776615.56639: variable 'ansible_search_path' from source: unknown 8262 1726776615.56669: calling self._execute() 8262 1726776615.56722: variable 'ansible_host' from source: host vars for 'managed_node1' 8262 1726776615.56732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8262 1726776615.56741: variable 'omit' from source: magic vars 8262 1726776615.56813: variable 'omit' from source: magic vars 8262 1726776615.56860: variable 'omit' from source: magic vars 8262 1726776615.57122: variable 'ffparams' from source: task vars 8262 1726776615.57219: variable 'ansible_facts' from source: unknown 8262 1726776615.57351: variable 'ansible_facts' from source: unknown 8262 1726776615.57443: variable 'ansible_facts' from source: unknown 8262 1726776615.57530: variable 'ansible_facts' from source: unknown 8262 1726776615.57605: variable 'role_path' from source: magic vars 8262 1726776615.57730: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8262 1726776615.57890: Loaded config def from plugin (lookup/first_found) 8262 1726776615.57898: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 8262 1726776615.57926: variable 'ansible_search_path' from source: unknown 8262 1726776615.57947: variable 'ansible_search_path' from source: unknown 8262 1726776615.57956: variable 'ansible_search_path' from source: unknown 8262 1726776615.57964: variable 'ansible_search_path' from source: unknown 8262 1726776615.57970: variable 'ansible_search_path' from source: unknown 8262 1726776615.57983: variable 'omit' from source: magic vars 8262 1726776615.57998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8262 1726776615.58014: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8262 1726776615.58032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8262 1726776615.58044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8262 1726776615.58054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8262 1726776615.58074: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8262 1726776615.58079: variable 'ansible_host' from source: host vars for 'managed_node1' 8262 1726776615.58081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8262 1726776615.58143: Set connection var ansible_shell_executable to /bin/sh 8262 1726776615.58149: Set connection var ansible_timeout to 10 8262 1726776615.58152: Set connection var ansible_module_compression to ZIP_DEFLATED 8262 1726776615.58154: Set connection var ansible_connection to ssh 8262 1726776615.58159: 
Set connection var ansible_pipelining to False 8262 1726776615.58162: Set connection var ansible_shell_type to sh 8262 1726776615.58175: variable 'ansible_shell_executable' from source: unknown 8262 1726776615.58177: variable 'ansible_connection' from source: unknown 8262 1726776615.58179: variable 'ansible_module_compression' from source: unknown 8262 1726776615.58181: variable 'ansible_shell_type' from source: unknown 8262 1726776615.58184: variable 'ansible_shell_executable' from source: unknown 8262 1726776615.58186: variable 'ansible_host' from source: host vars for 'managed_node1' 8262 1726776615.58188: variable 'ansible_pipelining' from source: unknown 8262 1726776615.58190: variable 'ansible_timeout' from source: unknown 8262 1726776615.58192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8262 1726776615.58269: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8262 1726776615.58281: variable 'omit' from source: magic vars 8262 1726776615.58287: starting attempt loop 8262 1726776615.58291: running the handler 8262 1726776615.58333: handler run complete 8262 1726776615.58341: attempt loop complete, returning result 8262 1726776615.58343: _execute() done 8262 1726776615.58346: dumping result to json 8262 1726776615.58348: done dumping result, returning 8262 1726776615.58352: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [120fa90a-8a95-f1be-6eb1-0000000001c1] 8262 1726776615.58357: sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001c1 8262 1726776615.58377: done sending task result for task 120fa90a-8a95-f1be-6eb1-0000000001c1 8262 1726776615.58380: WORKER PROCESS 
EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8186 1726776615.58575: no more pending results, returning what we have 8186 1726776615.58577: results queue empty 8186 1726776615.58578: checking for any_errors_fatal 8186 1726776615.58581: done checking for any_errors_fatal 8186 1726776615.58581: checking for max_fail_percentage 8186 1726776615.58582: done checking for max_fail_percentage 8186 1726776615.58582: checking to see if all hosts have failed and the running result is not ok 8186 1726776615.58583: done checking to see if all hosts have failed 8186 1726776615.58584: getting the remaining hosts for this loop 8186 1726776615.58584: done getting the remaining hosts for this loop 8186 1726776615.58586: getting the next task for host managed_node1 8186 1726776615.58592: done getting next task for host managed_node1 8186 1726776615.58595: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8186 1726776615.58597: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776615.58603: getting variables 8186 1726776615.58604: in VariableManager get_vars() 8186 1726776615.58626: Calling all_inventory to load vars for managed_node1 8186 1726776615.58630: Calling groups_inventory to load vars for managed_node1 8186 1726776615.58632: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776615.58642: Calling all_plugins_play to load vars for managed_node1 8186 1726776615.58644: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776615.58646: Calling groups_plugins_play to load vars for managed_node1 8186 1726776615.58745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776615.58863: done with get_vars() 8186 1726776615.58871: done getting variables 8186 1726776615.58940: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 16:10:15 -0400 (0:00:00.027) 0:00:03.461 **** 8186 1726776615.58966: entering _queue_task() for managed_node1/package 8186 1726776615.58967: Creating lock for package 8186 1726776615.59138: worker is 1 (out of 1 available) 8186 1726776615.59151: exiting _queue_task() for managed_node1/package 8186 1726776615.59165: done queuing things up, now waiting for results queue to drain 8186 1726776615.59167: waiting for pending results... 
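The `include_vars` result above reports the variables loaded from the role's `vars/default.yml`. Based solely on the `ansible_facts` printed in that result, the vars file presumably contains something like the following (a reconstruction, not the actual file contents):

```yaml
# Hypothetical reconstruction of
# roles/kernel_settings/vars/default.yml, inferred from the
# ansible_facts shown in the log; the real file may contain more keys.
__kernel_settings_packages:
  - tuned
  - python3-configobj
__kernel_settings_services:
  - tuned
```

These two lists then drive the package-install and service-management tasks that follow in the run.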
8263 1726776615.59268: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8263 1726776615.59369: in run() - task 120fa90a-8a95-f1be-6eb1-00000000013f 8263 1726776615.59385: variable 'ansible_search_path' from source: unknown 8263 1726776615.59390: variable 'ansible_search_path' from source: unknown 8263 1726776615.59414: calling self._execute() 8263 1726776615.59467: variable 'ansible_host' from source: host vars for 'managed_node1' 8263 1726776615.59474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8263 1726776615.59479: variable 'omit' from source: magic vars 8263 1726776615.59549: variable 'omit' from source: magic vars 8263 1726776615.59582: variable 'omit' from source: magic vars 8263 1726776615.59598: variable '__kernel_settings_packages' from source: include_vars 8263 1726776615.59856: variable '__kernel_settings_packages' from source: include_vars 8263 1726776615.60003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8263 1726776615.61631: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8263 1726776615.61678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8263 1726776615.61716: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8263 1726776615.61743: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8263 1726776615.61764: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8263 1726776615.61830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8263 
1726776615.61851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8263 1726776615.61871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8263 1726776615.61898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8263 1726776615.61908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8263 1726776615.61981: variable '__kernel_settings_is_ostree' from source: set_fact 8263 1726776615.61989: variable 'omit' from source: magic vars 8263 1726776615.62008: variable 'omit' from source: magic vars 8263 1726776615.62026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8263 1726776615.62049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8263 1726776615.62064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8263 1726776615.62077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8263 1726776615.62086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8263 1726776615.62111: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8263 1726776615.62116: variable 'ansible_host' from source: host vars for 'managed_node1' 8263 1726776615.62120: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8263 1726776615.62185: Set connection var ansible_shell_executable to /bin/sh 8263 1726776615.62193: Set connection var ansible_timeout to 10 8263 1726776615.62199: Set connection var ansible_module_compression to ZIP_DEFLATED 8263 1726776615.62203: Set connection var ansible_connection to ssh 8263 1726776615.62207: Set connection var ansible_pipelining to False 8263 1726776615.62212: Set connection var ansible_shell_type to sh 8263 1726776615.62227: variable 'ansible_shell_executable' from source: unknown 8263 1726776615.62231: variable 'ansible_connection' from source: unknown 8263 1726776615.62235: variable 'ansible_module_compression' from source: unknown 8263 1726776615.62238: variable 'ansible_shell_type' from source: unknown 8263 1726776615.62241: variable 'ansible_shell_executable' from source: unknown 8263 1726776615.62244: variable 'ansible_host' from source: host vars for 'managed_node1' 8263 1726776615.62248: variable 'ansible_pipelining' from source: unknown 8263 1726776615.62251: variable 'ansible_timeout' from source: unknown 8263 1726776615.62255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8263 1726776615.62313: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8263 1726776615.62324: variable 'omit' from source: magic vars 8263 1726776615.62331: starting attempt loop 8263 1726776615.62335: running the handler 8263 1726776615.62393: variable 'ansible_facts' from source: unknown 8263 1726776615.62472: _low_level_execute_command(): starting 8263 1726776615.62480: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8263 1726776615.64796: stdout chunk (state=2): >>>/root <<< 
8263 1726776615.64922: stderr chunk (state=3): >>><<< 8263 1726776615.64930: stdout chunk (state=3): >>><<< 8263 1726776615.64948: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8263 1726776615.64964: _low_level_execute_command(): starting 8263 1726776615.64971: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854 `" && echo ansible-tmp-1726776615.6495857-8263-118051301228854="` echo /root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854 `" ) && sleep 0' 8263 1726776615.67465: stdout chunk (state=2): >>>ansible-tmp-1726776615.6495857-8263-118051301228854=/root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854 <<< 8263 1726776615.67595: stderr chunk (state=3): >>><<< 8263 1726776615.67602: stdout chunk (state=3): >>><<< 8263 1726776615.67616: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776615.6495857-8263-118051301228854=/root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854 , stderr= 8263 1726776615.67643: variable 'ansible_module_compression' from source: unknown 8263 1726776615.67689: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 8263 1726776615.67693: ANSIBALLZ: Acquiring lock 8263 1726776615.67697: ANSIBALLZ: Lock acquired: 140184657595568 8263 1726776615.67701: ANSIBALLZ: Creating module 8263 1726776615.80288: ANSIBALLZ: Writing module into payload 8263 1726776615.80507: ANSIBALLZ: Writing module 8263 1726776615.80533: ANSIBALLZ: Renaming module 8263 1726776615.80542: ANSIBALLZ: Done creating module 8263 1726776615.80562: variable 'ansible_facts' from source: unknown 8263 1726776615.80685: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854/AnsiballZ_dnf.py 8263 1726776615.80914: Sending initial data 8263 1726776615.80922: Sent initial data (150 bytes) 8263 1726776615.83944: stdout 
chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmp_o_9g6as /root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854/AnsiballZ_dnf.py <<< 8263 1726776615.85986: stderr chunk (state=3): >>><<< 8263 1726776615.85997: stdout chunk (state=3): >>><<< 8263 1726776615.86020: done transferring module to remote 8263 1726776615.86034: _low_level_execute_command(): starting 8263 1726776615.86041: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854/ /root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854/AnsiballZ_dnf.py && sleep 0' 8263 1726776615.88614: stderr chunk (state=2): >>><<< 8263 1726776615.88625: stdout chunk (state=2): >>><<< 8263 1726776615.88643: _low_level_execute_command() done: rc=0, stdout=, stderr= 8263 1726776615.88649: _low_level_execute_command(): starting 8263 1726776615.88658: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854/AnsiballZ_dnf.py && sleep 0' 8263 1726776621.36044: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8263 1726776621.44173: stderr chunk 
(state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8263 1726776621.44224: stderr chunk (state=3): >>><<< 8263 1726776621.44233: stdout chunk (state=3): >>><<< 8263 1726776621.44247: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
8263 1726776621.44279: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8263 1726776621.44286: _low_level_execute_command(): starting 8263 1726776621.44290: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776615.6495857-8263-118051301228854/ > /dev/null 2>&1 && sleep 0' 8263 1726776621.46753: stderr chunk (state=2): >>><<< 8263 1726776621.46762: stdout chunk (state=2): >>><<< 8263 1726776621.46777: _low_level_execute_command() done: rc=0, stdout=, stderr= 8263 1726776621.46785: handler run complete 8263 1726776621.46811: attempt loop complete, returning result 8263 1726776621.46816: _execute() done 8263 1726776621.46819: dumping result to json 8263 1726776621.46826: done dumping result, returning 8263 1726776621.46834: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [120fa90a-8a95-f1be-6eb1-00000000013f] 8263 1726776621.46841: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000013f 8263 1726776621.46870: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000013f 8263 1726776621.46874: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8186 1726776621.47036: no more pending results, returning what 
we have 8186 1726776621.47039: results queue empty 8186 1726776621.47040: checking for any_errors_fatal 8186 1726776621.47044: done checking for any_errors_fatal 8186 1726776621.47045: checking for max_fail_percentage 8186 1726776621.47046: done checking for max_fail_percentage 8186 1726776621.47047: checking to see if all hosts have failed and the running result is not ok 8186 1726776621.47048: done checking to see if all hosts have failed 8186 1726776621.47049: getting the remaining hosts for this loop 8186 1726776621.47050: done getting the remaining hosts for this loop 8186 1726776621.47054: getting the next task for host managed_node1 8186 1726776621.47061: done getting next task for host managed_node1 8186 1726776621.47067: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8186 1726776621.47069: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776621.47079: getting variables 8186 1726776621.47080: in VariableManager get_vars() 8186 1726776621.47110: Calling all_inventory to load vars for managed_node1 8186 1726776621.47113: Calling groups_inventory to load vars for managed_node1 8186 1726776621.47114: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776621.47123: Calling all_plugins_play to load vars for managed_node1 8186 1726776621.47125: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776621.47127: Calling groups_plugins_play to load vars for managed_node1 8186 1726776621.47287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776621.47405: done with get_vars() 8186 1726776621.47414: done getting variables 8186 1726776621.47485: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 16:10:21 -0400 (0:00:05.885) 0:00:09.346 **** 8186 1726776621.47510: entering _queue_task() for managed_node1/debug 8186 1726776621.47511: Creating lock for debug 8186 1726776621.47686: worker is 1 (out of 1 available) 8186 1726776621.47702: exiting _queue_task() for managed_node1/debug 8186 1726776621.47711: done queuing things up, now waiting for results queue to drain 8186 1726776621.47714: waiting for pending results... 
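The package task that just completed (at `tasks/main.yml:12`) resolved `package` to `ansible.legacy.dnf` and passed `name: ['tuned', 'python3-configobj'], state: present`, per the `module_args` in the log. A minimal sketch of what that task likely looks like in the role, assuming it simply feeds `__kernel_settings_packages` to the generic `package` module:

```yaml
# Sketch inferred from the logged module args; the role's actual task
# may carry additional parameters (e.g. retries or a "use" override).
- name: Ensure required packages are installed
  package:
    name: "{{ __kernel_settings_packages }}"
    state: present
```

On this RHEL-family host the `package` action plugin dispatched to `dnf`, which returned `rc=0` with "Nothing to do" because both packages were already installed, hence `changed: false`.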
8381 1726776621.47821: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8381 1726776621.47933: in run() - task 120fa90a-8a95-f1be-6eb1-000000000141 8381 1726776621.47950: variable 'ansible_search_path' from source: unknown 8381 1726776621.47954: variable 'ansible_search_path' from source: unknown 8381 1726776621.47983: calling self._execute() 8381 1726776621.48040: variable 'ansible_host' from source: host vars for 'managed_node1' 8381 1726776621.48048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8381 1726776621.48057: variable 'omit' from source: magic vars 8381 1726776621.48395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8381 1726776621.50142: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8381 1726776621.50191: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8381 1726776621.50218: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8381 1726776621.50244: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8381 1726776621.50261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8381 1726776621.50315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8381 1726776621.50338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8381 1726776621.50360: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8381 1726776621.50389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8381 1726776621.50400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8381 1726776621.50479: variable '__kernel_settings_is_transactional' from source: set_fact 8381 1726776621.50496: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8381 1726776621.50500: when evaluation is False, skipping this task 8381 1726776621.50504: _execute() done 8381 1726776621.50507: dumping result to json 8381 1726776621.50511: done dumping result, returning 8381 1726776621.50517: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [120fa90a-8a95-f1be-6eb1-000000000141] 8381 1726776621.50525: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000141 8381 1726776621.50547: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000141 8381 1726776621.50549: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8186 1726776621.50768: no more pending results, returning what we have 8186 1726776621.50771: results queue empty 8186 1726776621.50771: checking for any_errors_fatal 8186 1726776621.50776: done checking for any_errors_fatal 8186 1726776621.50777: checking for max_fail_percentage 8186 1726776621.50778: done checking for max_fail_percentage 8186 1726776621.50778: checking to see if all hosts have failed and the running result is not 
ok 8186 1726776621.50779: done checking to see if all hosts have failed 8186 1726776621.50779: getting the remaining hosts for this loop 8186 1726776621.50780: done getting the remaining hosts for this loop 8186 1726776621.50782: getting the next task for host managed_node1 8186 1726776621.50786: done getting next task for host managed_node1 8186 1726776621.50789: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8186 1726776621.50791: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776621.50801: getting variables 8186 1726776621.50802: in VariableManager get_vars() 8186 1726776621.50834: Calling all_inventory to load vars for managed_node1 8186 1726776621.50836: Calling groups_inventory to load vars for managed_node1 8186 1726776621.50837: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776621.50846: Calling all_plugins_play to load vars for managed_node1 8186 1726776621.50847: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776621.50849: Calling groups_plugins_play to load vars for managed_node1 8186 1726776621.50953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776621.51076: done with get_vars() 8186 1726776621.51085: done getting variables 8186 1726776621.51180: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 16:10:21 -0400 (0:00:00.036) 0:00:09.383 **** 8186 1726776621.51200: entering _queue_task() for managed_node1/reboot 8186 1726776621.51202: Creating lock for reboot 8186 1726776621.51375: worker is 1 (out of 1 available) 8186 1726776621.51389: exiting _queue_task() for managed_node1/reboot 8186 1726776621.51399: done queuing things up, now waiting for results queue to drain 8186 1726776621.51402: waiting for pending results... 
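Both the "Notify user" and "Reboot transactional update systems" tasks are skipped because the log shows `Evaluated conditional (__kernel_settings_is_transactional | d(false)): False`. The skipped notify task presumably has roughly this shape (the task name and `when:` expression are taken from the log; the `msg` text is invented for illustration):

```yaml
# Illustrative only: the conditional gating is what the log confirms;
# the debug message body is a placeholder, not the role's actual text.
- name: Notify user that reboot is needed to apply changes
  debug:
    msg: Reboot is required to apply kernel settings changes.
  when: __kernel_settings_is_transactional | d(false)
```

The `d(false)` (alias of `default(false)`) filter makes the condition safely evaluate to `False` on hosts where `__kernel_settings_is_transactional` was never set, which is why non-transactional systems like this one skip the reboot path entirely.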
8382 1726776621.51507: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8382 1726776621.51611: in run() - task 120fa90a-8a95-f1be-6eb1-000000000142 8382 1726776621.51627: variable 'ansible_search_path' from source: unknown 8382 1726776621.51633: variable 'ansible_search_path' from source: unknown 8382 1726776621.51663: calling self._execute() 8382 1726776621.51770: variable 'ansible_host' from source: host vars for 'managed_node1' 8382 1726776621.51779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8382 1726776621.51787: variable 'omit' from source: magic vars 8382 1726776621.52098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8382 1726776621.53694: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8382 1726776621.53748: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8382 1726776621.53776: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8382 1726776621.53803: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8382 1726776621.53824: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8382 1726776621.53877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8382 1726776621.53899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8382 1726776621.53918: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8382 1726776621.53949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8382 1726776621.53961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8382 1726776621.54038: variable '__kernel_settings_is_transactional' from source: set_fact 8382 1726776621.54052: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8382 1726776621.54055: when evaluation is False, skipping this task 8382 1726776621.54057: _execute() done 8382 1726776621.54059: dumping result to json 8382 1726776621.54061: done dumping result, returning 8382 1726776621.54067: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [120fa90a-8a95-f1be-6eb1-000000000142] 8382 1726776621.54071: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000142 8382 1726776621.54090: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000142 8382 1726776621.54092: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8186 1726776621.54311: no more pending results, returning what we have 8186 1726776621.54313: results queue empty 8186 1726776621.54315: checking for any_errors_fatal 8186 1726776621.54319: done checking for any_errors_fatal 8186 1726776621.54319: checking for max_fail_percentage 8186 1726776621.54320: done checking for max_fail_percentage 8186 1726776621.54320: checking to see if 
all hosts have failed and the running result is not ok 8186 1726776621.54321: done checking to see if all hosts have failed 8186 1726776621.54321: getting the remaining hosts for this loop 8186 1726776621.54322: done getting the remaining hosts for this loop 8186 1726776621.54324: getting the next task for host managed_node1 8186 1726776621.54330: done getting next task for host managed_node1 8186 1726776621.54333: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8186 1726776621.54335: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776621.54344: getting variables 8186 1726776621.54345: in VariableManager get_vars() 8186 1726776621.54363: Calling all_inventory to load vars for managed_node1 8186 1726776621.54365: Calling groups_inventory to load vars for managed_node1 8186 1726776621.54366: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776621.54372: Calling all_plugins_play to load vars for managed_node1 8186 1726776621.54374: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776621.54375: Calling groups_plugins_play to load vars for managed_node1 8186 1726776621.54467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776621.54583: done with get_vars() 8186 1726776621.54589: done getting variables 8186 1726776621.54626: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 16:10:21 -0400 (0:00:00.034) 0:00:09.418 **** 8186 1726776621.54649: entering _queue_task() for managed_node1/fail 8186 1726776621.54802: worker is 1 (out of 1 available) 8186 1726776621.54817: exiting _queue_task() for managed_node1/fail 8186 1726776621.54827: done queuing things up, now waiting for results queue to drain 8186 1726776621.54831: waiting for pending results... 
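The two skipped tasks above both hinge on the when-clause `__kernel_settings_is_transactional | d(false)`: Ansible's `d` is an alias for Jinja2's `default` filter, so an unset fact falls back to `False` and the task is skipped with the result payload seen in the log. A minimal plain-Python sketch of that decision (the helper name `evaluate_when` and the dict-based variable lookup are illustrative assumptions, not Ansible internals):

```python
def evaluate_when(task_vars, var_name="__kernel_settings_is_transactional"):
    # Mirrors "<var> | d(false)": a missing/undefined variable falls back to False,
    # so the task only runs when the fact was explicitly set to a truthy value.
    value = task_vars.get(var_name, False)
    return bool(value)

# Shape of the skip result emitted in the log when the conditional is False
skip_result = {
    "changed": False,
    "false_condition": "__kernel_settings_is_transactional | d(false)",
    "skip_reason": "Conditional result was False",
}
```

On a non-transactional host (as here), `evaluate_when({})` is `False`, matching the `skipping: [managed_node1]` lines for both the reboot and fail tasks.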
8383 1726776621.54940: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8383 1726776621.55046: in run() - task 120fa90a-8a95-f1be-6eb1-000000000143 8383 1726776621.55065: variable 'ansible_search_path' from source: unknown 8383 1726776621.55070: variable 'ansible_search_path' from source: unknown 8383 1726776621.55098: calling self._execute() 8383 1726776621.55151: variable 'ansible_host' from source: host vars for 'managed_node1' 8383 1726776621.55159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8383 1726776621.55168: variable 'omit' from source: magic vars 8383 1726776621.55487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8383 1726776621.57144: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8383 1726776621.57191: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8383 1726776621.57220: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8383 1726776621.57248: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8383 1726776621.57279: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8383 1726776621.57336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8383 1726776621.57359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8383 1726776621.57380: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8383 1726776621.57407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8383 1726776621.57418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8383 1726776621.57496: variable '__kernel_settings_is_transactional' from source: set_fact 8383 1726776621.57512: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8383 1726776621.57516: when evaluation is False, skipping this task 8383 1726776621.57520: _execute() done 8383 1726776621.57523: dumping result to json 8383 1726776621.57527: done dumping result, returning 8383 1726776621.57534: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [120fa90a-8a95-f1be-6eb1-000000000143] 8383 1726776621.57540: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000143 8383 1726776621.57559: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000143 8383 1726776621.57561: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8186 1726776621.57728: no more pending results, returning what we have 8186 1726776621.57732: results queue empty 8186 1726776621.57733: checking for any_errors_fatal 8186 1726776621.57737: done checking for any_errors_fatal 8186 1726776621.57737: checking for max_fail_percentage 8186 1726776621.57739: done checking for max_fail_percentage 8186 1726776621.57740: checking to see if 
all hosts have failed and the running result is not ok 8186 1726776621.57740: done checking to see if all hosts have failed 8186 1726776621.57741: getting the remaining hosts for this loop 8186 1726776621.57742: done getting the remaining hosts for this loop 8186 1726776621.57745: getting the next task for host managed_node1 8186 1726776621.57755: done getting next task for host managed_node1 8186 1726776621.57758: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8186 1726776621.57761: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776621.57775: getting variables 8186 1726776621.57777: in VariableManager get_vars() 8186 1726776621.57808: Calling all_inventory to load vars for managed_node1 8186 1726776621.57810: Calling groups_inventory to load vars for managed_node1 8186 1726776621.57812: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776621.57820: Calling all_plugins_play to load vars for managed_node1 8186 1726776621.57822: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776621.57824: Calling groups_plugins_play to load vars for managed_node1 8186 1726776621.57924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776621.58211: done with get_vars() 8186 1726776621.58218: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 16:10:21 -0400 (0:00:00.036) 0:00:09.454 **** 8186 1726776621.58275: entering _queue_task() for managed_node1/fedora.linux_system_roles.kernel_settings_get_config 8186 1726776621.58276: Creating lock for fedora.linux_system_roles.kernel_settings_get_config 8186 1726776621.58444: worker is 1 (out of 1 available) 8186 1726776621.58458: exiting _queue_task() for managed_node1/fedora.linux_system_roles.kernel_settings_get_config 8186 1726776621.58469: done queuing things up, now waiting for results queue to drain 8186 1726776621.58471: waiting for pending results... 
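The "Read tuned main config" task that runs next executes the role's custom `kernel_settings_get_config` module against `/etc/tuned/tuned-main.conf` and returns its settings as a flat dict of strings. A hedged sketch of how such a read could work, assuming the file is plain `key = value` pairs without section headers (the dummy `[main]` section and the function name are my own scaffolding, not the module's actual code):

```python
import configparser

SAMPLE = """\
daemon = 1
dynamic_tuning = 0
sleep_interval = 1
"""

def read_tuned_main_conf(text):
    # tuned-main.conf has no [section] headers, so prepend a dummy one
    # to satisfy configparser; values stay strings, as in the log output.
    parser = configparser.ConfigParser()
    parser.read_string("[main]\n" + text)
    return dict(parser["main"])

data = read_tuned_main_conf(SAMPLE)
```

The result shape matches the log's `"data": {"daemon": "1", "dynamic_tuning": "0", ...}` — note everything comes back as strings, which is why later comparisons in the role treat these values as text.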
8384 1726776621.58587: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8384 1726776621.58694: in run() - task 120fa90a-8a95-f1be-6eb1-000000000145 8384 1726776621.58712: variable 'ansible_search_path' from source: unknown 8384 1726776621.58716: variable 'ansible_search_path' from source: unknown 8384 1726776621.58745: calling self._execute() 8384 1726776621.58804: variable 'ansible_host' from source: host vars for 'managed_node1' 8384 1726776621.58810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8384 1726776621.58816: variable 'omit' from source: magic vars 8384 1726776621.58888: variable 'omit' from source: magic vars 8384 1726776621.58925: variable 'omit' from source: magic vars 8384 1726776621.58945: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 8384 1726776621.59168: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 8384 1726776621.59225: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8384 1726776621.59251: variable 'omit' from source: magic vars 8384 1726776621.59282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8384 1726776621.59307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8384 1726776621.59324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8384 1726776621.59336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8384 1726776621.59346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8384 1726776621.59368: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8384 1726776621.59372: variable 'ansible_host' from source: host vars for 
'managed_node1' 8384 1726776621.59376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8384 1726776621.59461: Set connection var ansible_shell_executable to /bin/sh 8384 1726776621.59470: Set connection var ansible_timeout to 10 8384 1726776621.59477: Set connection var ansible_module_compression to ZIP_DEFLATED 8384 1726776621.59480: Set connection var ansible_connection to ssh 8384 1726776621.59487: Set connection var ansible_pipelining to False 8384 1726776621.59493: Set connection var ansible_shell_type to sh 8384 1726776621.59509: variable 'ansible_shell_executable' from source: unknown 8384 1726776621.59513: variable 'ansible_connection' from source: unknown 8384 1726776621.59516: variable 'ansible_module_compression' from source: unknown 8384 1726776621.59519: variable 'ansible_shell_type' from source: unknown 8384 1726776621.59523: variable 'ansible_shell_executable' from source: unknown 8384 1726776621.59526: variable 'ansible_host' from source: host vars for 'managed_node1' 8384 1726776621.59532: variable 'ansible_pipelining' from source: unknown 8384 1726776621.59535: variable 'ansible_timeout' from source: unknown 8384 1726776621.59540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8384 1726776621.59660: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8384 1726776621.59672: variable 'omit' from source: magic vars 8384 1726776621.59678: starting attempt loop 8384 1726776621.59681: running the handler 8384 1726776621.59693: _low_level_execute_command(): starting 8384 1726776621.59700: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8384 1726776621.62037: stdout chunk (state=2): >>>/root <<< 8384 1726776621.62153: stderr chunk (state=3): >>><<< 8384 
1726776621.62159: stdout chunk (state=3): >>><<< 8384 1726776621.62178: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8384 1726776621.62191: _low_level_execute_command(): starting 8384 1726776621.62197: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400 `" && echo ansible-tmp-1726776621.6218624-8384-277416094418400="` echo /root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400 `" ) && sleep 0' 8384 1726776621.64648: stdout chunk (state=2): >>>ansible-tmp-1726776621.6218624-8384-277416094418400=/root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400 <<< 8384 1726776621.64779: stderr chunk (state=3): >>><<< 8384 1726776621.64787: stdout chunk (state=3): >>><<< 8384 1726776621.64802: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776621.6218624-8384-277416094418400=/root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400 , stderr= 8384 1726776621.64842: variable 'ansible_module_compression' from source: unknown 8384 1726776621.64883: ANSIBALLZ: Using lock for fedora.linux_system_roles.kernel_settings_get_config 8384 1726776621.64888: ANSIBALLZ: Acquiring lock 8384 1726776621.64892: ANSIBALLZ: Lock acquired: 140184656148240 8384 1726776621.64896: ANSIBALLZ: Creating module 8384 1726776621.74126: ANSIBALLZ: Writing module into payload 8384 1726776621.74189: ANSIBALLZ: Writing module 8384 1726776621.74211: ANSIBALLZ: Renaming module 8384 1726776621.74218: ANSIBALLZ: Done creating module 8384 1726776621.74239: variable 'ansible_facts' from source: unknown 8384 1726776621.74297: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400/AnsiballZ_kernel_settings_get_config.py 8384 1726776621.74399: Sending initial data 8384 1726776621.74406: Sent initial data (173 bytes) 8384 1726776621.76974: stdout chunk 
(state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmp4vi_i1c2 /root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400/AnsiballZ_kernel_settings_get_config.py <<< 8384 1726776621.78010: stderr chunk (state=3): >>><<< 8384 1726776621.78017: stdout chunk (state=3): >>><<< 8384 1726776621.78036: done transferring module to remote 8384 1726776621.78046: _low_level_execute_command(): starting 8384 1726776621.78052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400/ /root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8384 1726776621.80393: stderr chunk (state=2): >>><<< 8384 1726776621.80399: stdout chunk (state=2): >>><<< 8384 1726776621.80412: _low_level_execute_command() done: rc=0, stdout=, stderr= 8384 1726776621.80416: _low_level_execute_command(): starting 8384 1726776621.80421: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8384 1726776621.96081: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 8384 1726776621.97146: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. 
<<< 8384 1726776621.97188: stderr chunk (state=3): >>><<< 8384 1726776621.97196: stdout chunk (state=3): >>><<< 8384 1726776621.97212: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.14.221 closed. 8384 1726776621.97238: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8384 1726776621.97249: _low_level_execute_command(): starting 8384 1726776621.97254: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776621.6218624-8384-277416094418400/ > /dev/null 2>&1 && sleep 0' 8384 1726776621.99643: stderr chunk (state=2): >>><<< 8384 1726776621.99653: stdout chunk (state=2): >>><<< 8384 1726776621.99676: _low_level_execute_command() done: rc=0, stdout=, stderr= 8384 1726776621.99684: handler run complete 8384 1726776621.99706: attempt loop complete, returning result 8384 1726776621.99711: _execute() done 8384 1726776621.99713: dumping result to json 
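The remote tmpdir names in the transfer sequence above (`ansible-tmp-1726776621.6218624-8384-277416094418400`) follow a `ansible-tmp-<timestamp>-<pid>-<random>` pattern. A sketch reconstructing that naming, offered as an assumption inferred from the log rather than a quote of Ansible's shell-plugin source:

```python
import os
import random
import time

def make_remote_tmp_name():
    # Hypothetical reconstruction of the "ansible-tmp-<time>-<pid>-<random>"
    # pattern visible in the log; the exact random range is a guess.
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))

name = make_remote_tmp_name()
```

The unique suffix is why each task's `mkdir`/`sftp put`/`chmod u+x`/`rm -f -r` cycle can run concurrently across workers without collisions.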
8384 1726776621.99716: done dumping result, returning 8384 1726776621.99722: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [120fa90a-8a95-f1be-6eb1-000000000145] 8384 1726776621.99726: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000145 8384 1726776621.99771: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000145 8384 1726776621.99775: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8186 1726776622.00130: no more pending results, returning what we have 8186 1726776622.00134: results queue empty 8186 1726776622.00134: checking for any_errors_fatal 8186 1726776622.00140: done checking for any_errors_fatal 8186 1726776622.00140: checking for max_fail_percentage 8186 1726776622.00142: done checking for max_fail_percentage 8186 1726776622.00142: checking to see if all hosts have failed and the running result is not ok 8186 1726776622.00143: done checking to see if all hosts have failed 8186 1726776622.00144: getting the remaining hosts for this loop 8186 1726776622.00145: done getting the remaining hosts for this loop 8186 1726776622.00148: getting the next task for host managed_node1 8186 1726776622.00154: done getting next task for host managed_node1 8186 1726776622.00157: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8186 1726776622.00163: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776622.00178: getting variables 8186 1726776622.00179: in VariableManager get_vars() 8186 1726776622.00213: Calling all_inventory to load vars for managed_node1 8186 1726776622.00216: Calling groups_inventory to load vars for managed_node1 8186 1726776622.00218: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776622.00228: Calling all_plugins_play to load vars for managed_node1 8186 1726776622.00232: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776622.00235: Calling groups_plugins_play to load vars for managed_node1 8186 1726776622.00407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776622.00623: done with get_vars() 8186 1726776622.00636: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 16:10:22 -0400 (0:00:00.424) 0:00:09.879 **** 8186 1726776622.00727: entering _queue_task() for managed_node1/stat 8186 1726776622.00901: worker is 1 (out of 1 available) 8186 1726776622.00914: exiting _queue_task() for managed_node1/stat 8186 1726776622.00927: done queuing things up, now waiting for results queue to drain 8186 1726776622.00932: 
waiting for pending results... 8406 1726776622.01046: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8406 1726776622.01154: in run() - task 120fa90a-8a95-f1be-6eb1-000000000146 8406 1726776622.01171: variable 'ansible_search_path' from source: unknown 8406 1726776622.01176: variable 'ansible_search_path' from source: unknown 8406 1726776622.01212: variable '__prof_from_conf' from source: task vars 8406 1726776622.01443: variable '__prof_from_conf' from source: task vars 8406 1726776622.01575: variable '__data' from source: task vars 8406 1726776622.01634: variable '__kernel_settings_register_tuned_main' from source: set_fact 8406 1726776622.01769: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8406 1726776622.01779: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8406 1726776622.01823: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8406 1726776622.01838: variable 'omit' from source: magic vars 8406 1726776622.01966: variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726776622.01976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726776622.01985: variable 'omit' from source: magic vars 8406 1726776622.02156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8406 1726776622.03888: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8406 1726776622.03947: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8406 1726776622.03980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8406 1726776622.04006: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8406 1726776622.04027: Loading 
FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8406 1726776622.04084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8406 1726776622.04105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8406 1726776622.04124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8406 1726776622.04156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8406 1726776622.04169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8406 1726776622.04237: variable 'item' from source: unknown 8406 1726776622.04249: Evaluated conditional (item | length > 0): False 8406 1726776622.04254: when evaluation is False, skipping this task 8406 1726776622.04277: variable 'item' from source: unknown 8406 1726776622.04321: variable 'item' from source: unknown skipping: [managed_node1] => (item=) => { "ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 8406 1726776622.04398: variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726776622.04408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726776622.04417: variable 'omit' from source: magic vars 8406 
1726776622.04536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8406 1726776622.04554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8406 1726776622.04572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8406 1726776622.04601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8406 1726776622.04612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8406 1726776622.04667: variable 'item' from source: unknown 8406 1726776622.04676: Evaluated conditional (item | length > 0): True 8406 1726776622.04682: variable 'omit' from source: magic vars 8406 1726776622.04715: variable 'omit' from source: magic vars 8406 1726776622.04746: variable 'item' from source: unknown 8406 1726776622.04790: variable 'item' from source: unknown 8406 1726776622.04804: variable 'omit' from source: magic vars 8406 1726776622.04825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8406 1726776622.04847: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8406 1726776622.04862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8406 1726776622.04874: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8406 1726776622.04881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8406 1726776622.04900: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8406 1726776622.04903: variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726776622.04905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726776622.04971: Set connection var ansible_shell_executable to /bin/sh 8406 1726776622.04977: Set connection var ansible_timeout to 10 8406 1726776622.04981: Set connection var ansible_module_compression to ZIP_DEFLATED 8406 1726776622.04983: Set connection var ansible_connection to ssh 8406 1726776622.04987: Set connection var ansible_pipelining to False 8406 1726776622.04990: Set connection var ansible_shell_type to sh 8406 1726776622.05000: variable 'ansible_shell_executable' from source: unknown 8406 1726776622.05003: variable 'ansible_connection' from source: unknown 8406 1726776622.05005: variable 'ansible_module_compression' from source: unknown 8406 1726776622.05006: variable 'ansible_shell_type' from source: unknown 8406 1726776622.05008: variable 'ansible_shell_executable' from source: unknown 8406 1726776622.05009: variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726776622.05011: variable 'ansible_pipelining' from source: unknown 8406 1726776622.05013: variable 'ansible_timeout' from source: unknown 8406 1726776622.05015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726776622.05100: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
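The loop above evaluated `item | length > 0` per item: the empty string from `__prof_from_conf` was skipped, while `/etc/tuned/profiles` passed and proceeds to the stat call below. A minimal sketch of that per-item filter (the helper name is illustrative):

```python
def items_to_run(items):
    # Mirrors the loop's "item | length > 0" when-clause:
    # empty candidate paths are skipped, non-empty ones are stat'ed.
    return [item for item in items if len(item) > 0]

candidates = ["", "/etc/tuned/profiles"]
runnable = items_to_run(candidates)
```

This matches the log: one `skipping: [managed_node1] => (item=)` entry for the empty item, then a real execution for `/etc/tuned/profiles`.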
8406 1726776622.05108: variable 'omit' from source: magic vars 8406 1726776622.05112: starting attempt loop 8406 1726776622.05114: running the handler 8406 1726776622.05122: _low_level_execute_command(): starting 8406 1726776622.05127: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8406 1726776622.07407: stdout chunk (state=2): >>>/root <<< 8406 1726776622.07527: stderr chunk (state=3): >>><<< 8406 1726776622.07536: stdout chunk (state=3): >>><<< 8406 1726776622.07554: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8406 1726776622.07567: _low_level_execute_command(): starting 8406 1726776622.07573: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307 `" && echo ansible-tmp-1726776622.0756137-8406-169045314642307="` echo /root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307 `" ) && sleep 0' 8406 1726776622.10030: stdout chunk (state=2): >>>ansible-tmp-1726776622.0756137-8406-169045314642307=/root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307 <<< 8406 1726776622.10157: stderr chunk (state=3): >>><<< 8406 1726776622.10164: stdout chunk (state=3): >>><<< 8406 1726776622.10177: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776622.0756137-8406-169045314642307=/root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307 , stderr= 8406 1726776622.10214: variable 'ansible_module_compression' from source: unknown 8406 1726776622.10254: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8406 1726776622.10282: variable 'ansible_facts' from source: unknown 8406 1726776622.10349: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307/AnsiballZ_stat.py 8406 1726776622.10443: Sending initial data 8406 1726776622.10451: 
Sent initial data (151 bytes) 8406 1726776622.12897: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmppysphttu /root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307/AnsiballZ_stat.py <<< 8406 1726776622.13927: stderr chunk (state=3): >>><<< 8406 1726776622.13935: stdout chunk (state=3): >>><<< 8406 1726776622.13953: done transferring module to remote 8406 1726776622.13966: _low_level_execute_command(): starting 8406 1726776622.13972: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307/ /root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307/AnsiballZ_stat.py && sleep 0' 8406 1726776622.16281: stderr chunk (state=2): >>><<< 8406 1726776622.16287: stdout chunk (state=2): >>><<< 8406 1726776622.16299: _low_level_execute_command() done: rc=0, stdout=, stderr= 8406 1726776622.16303: _low_level_execute_command(): starting 8406 1726776622.16308: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307/AnsiballZ_stat.py && sleep 0' 8406 1726776622.31183: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8406 1726776622.32279: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. 
<<< 8406 1726776622.32289: stdout chunk (state=3): >>><<< 8406 1726776622.32302: stderr chunk (state=3): >>><<< 8406 1726776622.32314: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.14.221 closed. 8406 1726776622.32343: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8406 1726776622.32356: _low_level_execute_command(): starting 8406 1726776622.32362: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776622.0756137-8406-169045314642307/ > /dev/null 2>&1 && sleep 0' 8406 1726776622.35018: stderr chunk (state=2): >>><<< 8406 1726776622.35028: stdout chunk (state=2): >>><<< 8406 1726776622.35045: _low_level_execute_command() done: rc=0, stdout=, stderr= 8406 1726776622.35053: handler run complete 8406 1726776622.35078: attempt loop complete, returning result 8406 1726776622.35099: variable 'item' from source: unknown 8406 1726776622.35180: variable 'item' from source: unknown
ok: [managed_node1] => (item=/etc/tuned/profiles) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/tuned/profiles",
    "stat": {
        "exists": false
    }
}
8406 1726776622.35276: variable
'ansible_host' from source: host vars for 'managed_node1' 8406 1726776622.35288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726776622.35297: variable 'omit' from source: magic vars 8406 1726776622.35452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8406 1726776622.35481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8406 1726776622.35507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8406 1726776622.35548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8406 1726776622.35563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8406 1726776622.35644: variable 'item' from source: unknown 8406 1726776622.35654: Evaluated conditional (item | length > 0): True 8406 1726776622.35659: variable 'omit' from source: magic vars 8406 1726776622.35675: variable 'omit' from source: magic vars 8406 1726776622.35718: variable 'item' from source: unknown 8406 1726776622.35783: variable 'item' from source: unknown 8406 1726776622.35799: variable 'omit' from source: magic vars 8406 1726776622.35817: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8406 1726776622.35826: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8406 1726776622.35834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8406 1726776622.35847: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8406 1726776622.35851: variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726776622.35854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726776622.35927: Set connection var ansible_shell_executable to /bin/sh 8406 1726776622.35937: Set connection var ansible_timeout to 10 8406 1726776622.35943: Set connection var ansible_module_compression to ZIP_DEFLATED 8406 1726776622.35946: Set connection var ansible_connection to ssh 8406 1726776622.35953: Set connection var ansible_pipelining to False 8406 1726776622.35958: Set connection var ansible_shell_type to sh 8406 1726776622.35978: variable 'ansible_shell_executable' from source: unknown 8406 1726776622.35983: variable 'ansible_connection' from source: unknown 8406 1726776622.35986: variable 'ansible_module_compression' from source: unknown 8406 1726776622.35989: variable 'ansible_shell_type' from source: unknown 8406 1726776622.35991: variable 'ansible_shell_executable' from source: unknown 8406 1726776622.35994: variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726776622.35997: variable 'ansible_pipelining' from source: unknown 8406 1726776622.36000: variable 'ansible_timeout' from source: unknown 8406 1726776622.36003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726776622.36283: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8406 1726776622.36295: variable 'omit' from source: magic vars 8406 1726776622.36301: starting attempt loop 8406 1726776622.36306: running the handler 8406 1726776622.36314: _low_level_execute_command(): starting 8406 1726776622.36319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8406 1726776622.38730: stdout chunk (state=2): >>>/root <<< 8406 1726776622.38864: stderr chunk (state=3): >>><<< 8406 1726776622.38873: stdout chunk (state=3): >>><<< 8406 1726776622.38889: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8406 1726776622.38901: _low_level_execute_command(): starting 8406 1726776622.38908: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782 `" && echo ansible-tmp-1726776622.3889735-8406-165798380964782="` echo /root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782 `" ) && sleep 0' 8406 1726776622.41592: stdout chunk (state=2): >>>ansible-tmp-1726776622.3889735-8406-165798380964782=/root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782 <<< 8406 1726776622.41732: stderr chunk (state=3): >>><<< 8406 1726776622.41739: stdout chunk (state=3): >>><<< 8406 1726776622.41754: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776622.3889735-8406-165798380964782=/root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782 , stderr= 8406 1726776622.41788: variable 'ansible_module_compression' from source: unknown 8406 1726776622.41837: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8406 1726776622.41855: variable 'ansible_facts' from source: unknown 8406 1726776622.41941: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782/AnsiballZ_stat.py 8406 1726776622.42433: Sending initial data 8406 1726776622.42441: Sent initial data (151 bytes) 8406 1726776622.44755: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpxb3xxq3s /root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782/AnsiballZ_stat.py <<< 8406 1726776622.45948: stderr chunk (state=3): >>><<< 8406 1726776622.45954: stdout chunk (state=3): >>><<< 8406 1726776622.45974: done transferring module to remote 8406 1726776622.45983: _low_level_execute_command(): starting 8406 1726776622.45988: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782/ /root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782/AnsiballZ_stat.py && sleep 0' 8406 1726776622.48550: stderr chunk (state=2): >>><<< 8406 1726776622.48558: stdout chunk (state=2): >>><<< 8406 1726776622.48577: _low_level_execute_command() done: rc=0, stdout=, stderr= 8406 1726776622.48582: _low_level_execute_command(): starting 8406 1726776622.48587: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782/AnsiballZ_stat.py && sleep 0' 8406 1726776622.64418: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726776440.2400825, "mtime": 1716968741.377, "ctime": 1716968741.377, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": 
true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8406 1726776622.65546: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8406 1726776622.65592: stderr chunk (state=3): >>><<< 8406 1726776622.65600: stdout chunk (state=3): >>><<< 8406 1726776622.65616: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726776440.2400825, "mtime": 1716968741.377, "ctime": 1716968741.377, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.14.221 closed. 
8406 1726776622.65674: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8406 1726776622.65684: _low_level_execute_command(): starting 8406 1726776622.65691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776622.3889735-8406-165798380964782/ > /dev/null 2>&1 && sleep 0' 8406 1726776622.68074: stderr chunk (state=2): >>><<< 8406 1726776622.68082: stdout chunk (state=2): >>><<< 8406 1726776622.68096: _low_level_execute_command() done: rc=0, stdout=, stderr= 8406 1726776622.68103: handler run complete 8406 1726776622.68138: attempt loop complete, returning result 8406 1726776622.68154: variable 'item' from source: unknown 8406 1726776622.68215: variable 'item' from source: unknown
ok: [managed_node1] => (item=/etc/tuned) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "/etc/tuned",
    "stat": {
        "atime": 1726776440.2400825,
        "attr_flags": "",
        "attributes": [],
        "block_size": 4096,
        "blocks": 0,
        "charset": "binary",
        "ctime": 1716968741.377,
        "dev": 51713,
        "device_type": 0,
        "executable": true,
        "exists": true,
        "gid": 0,
        "gr_name": "root",
        "inode": 917919,
        "isblk": false,
        "ischr": false,
        "isdir": true,
        "isfifo": false,
        "isgid": false,
        "islnk": false,
        "isreg": false,
        "issock": false,
        "isuid": false,
        "mimetype": "inode/directory",
        "mode": "0755",
        "mtime": 1716968741.377,
        "nlink": 3,
        "path": "/etc/tuned",
        "pw_name": "root",
        "readable": true,
        "rgrp": true,
        "roth": true,
        "rusr": true,
        "size": 136,
        "uid": 0,
        "version": "1785990601",
        "wgrp": false,
        "woth": false,
        "writeable": true,
        "wusr": true,
        "xgrp": true,
        "xoth": true,
        "xusr": true
    }
}
8406 1726776622.68259: dumping result to json 8406 1726776622.68272: done dumping result, returning 8406 1726776622.68280: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [120fa90a-8a95-f1be-6eb1-000000000146] 8406 1726776622.68286: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000146 8406 1726776622.68323: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000146 8406 1726776622.68327: WORKER PROCESS EXITING 8186 1726776622.68568: no more pending results, returning what we have 8186 1726776622.68571: results queue empty 8186 1726776622.68572: checking for any_errors_fatal 8186 1726776622.68575: done checking for any_errors_fatal 8186 1726776622.68576: checking for max_fail_percentage 8186 1726776622.68577: done checking for max_fail_percentage 8186 1726776622.68578: checking to see if all hosts have failed and the running result is not ok 8186 1726776622.68578: done checking to see if all hosts have failed 8186 1726776622.68579: getting the remaining hosts for this loop 8186 1726776622.68580: done getting the remaining hosts for this loop 8186 1726776622.68582: getting the next task for host managed_node1 8186 1726776622.68587: done getting next task for host managed_node1 8186 1726776622.68591: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8186 1726776622.68594: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776622.68603: getting variables 8186 1726776622.68604: in VariableManager get_vars() 8186 1726776622.68624: Calling all_inventory to load vars for managed_node1 8186 1726776622.68625: Calling groups_inventory to load vars for managed_node1 8186 1726776622.68626: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776622.68636: Calling all_plugins_play to load vars for managed_node1 8186 1726776622.68637: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776622.68639: Calling groups_plugins_play to load vars for managed_node1 8186 1726776622.68746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776622.68863: done with get_vars() 8186 1726776622.68872: done getting variables 8186 1726776622.68910: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63
Thursday 19 September 2024 16:10:22 -0400 (0:00:00.682) 0:00:10.561 ****
8186 1726776622.68936: entering _queue_task() for managed_node1/set_fact 8186 1726776622.69096: worker is 1 (out of 1 available) 8186 1726776622.69109: exiting _queue_task() for managed_node1/set_fact 8186 1726776622.69121: done queuing things up, now waiting for results queue to drain 8186 1726776622.69123: waiting for pending results... 8432 1726776622.69320: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8432 1726776622.69444: in run() - task 120fa90a-8a95-f1be-6eb1-000000000147 8432 1726776622.69463: variable 'ansible_search_path' from source: unknown 8432 1726776622.69467: variable 'ansible_search_path' from source: unknown 8432 1726776622.69495: calling self._execute() 8432 1726776622.69565: variable 'ansible_host' from source: host vars for 'managed_node1' 8432 1726776622.69575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8432 1726776622.69583: variable 'omit' from source: magic vars 8432 1726776622.69676: variable 'omit' from source: magic vars 8432 1726776622.69731: variable 'omit' from source: magic vars 8432 1726776622.70158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8432 1726776622.72381: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8432 1726776622.72455: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8432 1726776622.72490: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8432 1726776622.72540: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8432 1726776622.72566: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8432 1726776622.72638: Loading
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8432 1726776622.72666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8432 1726776622.72696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8432 1726776622.72737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8432 1726776622.72750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8432 1726776622.72783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8432 1726776622.72799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8432 1726776622.72813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8432 1726776622.72841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8432 
1726776622.72852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8432 1726776622.72899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8432 1726776622.72916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8432 1726776622.72936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8432 1726776622.72961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8432 1726776622.72973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8432 1726776622.73123: variable '__kernel_settings_find_profile_dirs' from source: set_fact 8432 1726776622.73187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8432 1726776622.73295: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8432 1726776622.73323: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8432 1726776622.73346: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8432 1726776622.73364: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8432 1726776622.73391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8432 1726776622.73405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8432 1726776622.73421: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8432 1726776622.73451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8432 1726776622.73489: variable 'omit' from source: magic vars 8432 1726776622.73511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8432 1726776622.73531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8432 1726776622.73545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8432 1726776622.73558: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8432 1726776622.73566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8432 1726776622.73587: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8432 1726776622.73590: variable 'ansible_host' from source: host vars for 'managed_node1' 8432 1726776622.73593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8432 1726776622.73660: Set connection var 
ansible_shell_executable to /bin/sh 8432 1726776622.73667: Set connection var ansible_timeout to 10 8432 1726776622.73671: Set connection var ansible_module_compression to ZIP_DEFLATED 8432 1726776622.73673: Set connection var ansible_connection to ssh 8432 1726776622.73677: Set connection var ansible_pipelining to False 8432 1726776622.73680: Set connection var ansible_shell_type to sh 8432 1726776622.73694: variable 'ansible_shell_executable' from source: unknown 8432 1726776622.73697: variable 'ansible_connection' from source: unknown 8432 1726776622.73699: variable 'ansible_module_compression' from source: unknown 8432 1726776622.73700: variable 'ansible_shell_type' from source: unknown 8432 1726776622.73702: variable 'ansible_shell_executable' from source: unknown 8432 1726776622.73704: variable 'ansible_host' from source: host vars for 'managed_node1' 8432 1726776622.73706: variable 'ansible_pipelining' from source: unknown 8432 1726776622.73707: variable 'ansible_timeout' from source: unknown 8432 1726776622.73709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8432 1726776622.73769: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8432 1726776622.73779: variable 'omit' from source: magic vars 8432 1726776622.73782: starting attempt loop 8432 1726776622.73784: running the handler 8432 1726776622.73792: handler run complete 8432 1726776622.73797: attempt loop complete, returning result 8432 1726776622.73799: _execute() done 8432 1726776622.73801: dumping result to json 8432 1726776622.73803: done dumping result, returning 8432 1726776622.73807: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 
[120fa90a-8a95-f1be-6eb1-000000000147] 8432 1726776622.73811: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000147 8432 1726776622.73826: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000147 8432 1726776622.73827: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "__kernel_settings_profile_parent": "/etc/tuned"
    },
    "changed": false
}
8186 1726776622.74004: no more pending results, returning what we have 8186 1726776622.74007: results queue empty 8186 1726776622.74007: checking for any_errors_fatal 8186 1726776622.74014: done checking for any_errors_fatal 8186 1726776622.74015: checking for max_fail_percentage 8186 1726776622.74016: done checking for max_fail_percentage 8186 1726776622.74017: checking to see if all hosts have failed and the running result is not ok 8186 1726776622.74017: done checking to see if all hosts have failed 8186 1726776622.74018: getting the remaining hosts for this loop 8186 1726776622.74019: done getting the remaining hosts for this loop 8186 1726776622.74022: getting the next task for host managed_node1 8186 1726776622.74027: done getting next task for host managed_node1 8186 1726776622.74033: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8186 1726776622.74036: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776622.74045: getting variables 8186 1726776622.74046: in VariableManager get_vars() 8186 1726776622.74076: Calling all_inventory to load vars for managed_node1 8186 1726776622.74078: Calling groups_inventory to load vars for managed_node1 8186 1726776622.74079: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776622.74086: Calling all_plugins_play to load vars for managed_node1 8186 1726776622.74088: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776622.74089: Calling groups_plugins_play to load vars for managed_node1 8186 1726776622.74194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776622.74338: done with get_vars() 8186 1726776622.74346: done getting variables 8186 1726776622.74413: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67
Thursday 19 September 2024 16:10:22 -0400 (0:00:00.055) 0:00:10.616 ****
8186 1726776622.74437: entering _queue_task() for managed_node1/service 8186 1726776622.74438: Creating lock for service 8186 1726776622.74623: worker is 1 (out of 1 available) 8186 1726776622.74638: exiting _queue_task() for managed_node1/service 8186 1726776622.74650: done queuing things up, now waiting for results queue to drain 8186 1726776622.74652: waiting for pending results...
8435 1726776622.74773: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8435 1726776622.74881: in run() - task 120fa90a-8a95-f1be-6eb1-000000000148 8435 1726776622.74896: variable 'ansible_search_path' from source: unknown 8435 1726776622.74900: variable 'ansible_search_path' from source: unknown 8435 1726776622.74935: variable '__kernel_settings_services' from source: include_vars 8435 1726776622.75165: variable '__kernel_settings_services' from source: include_vars 8435 1726776622.75221: variable 'omit' from source: magic vars 8435 1726776622.75311: variable 'ansible_host' from source: host vars for 'managed_node1' 8435 1726776622.75319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8435 1726776622.75325: variable 'omit' from source: magic vars 8435 1726776622.75388: variable 'omit' from source: magic vars 8435 1726776622.75420: variable 'omit' from source: magic vars 8435 1726776622.75449: variable 'item' from source: unknown 8435 1726776622.75502: variable 'item' from source: unknown 8435 1726776622.75518: variable 'omit' from source: magic vars 8435 1726776622.75549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8435 1726776622.75575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8435 1726776622.75592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8435 1726776622.75605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8435 1726776622.75615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8435 1726776622.75643: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8435 1726776622.75647: variable 
'ansible_host' from source: host vars for 'managed_node1' 8435 1726776622.75651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8435 1726776622.75716: Set connection var ansible_shell_executable to /bin/sh 8435 1726776622.75726: Set connection var ansible_timeout to 10 8435 1726776622.75735: Set connection var ansible_module_compression to ZIP_DEFLATED 8435 1726776622.75739: Set connection var ansible_connection to ssh 8435 1726776622.75746: Set connection var ansible_pipelining to False 8435 1726776622.75750: Set connection var ansible_shell_type to sh 8435 1726776622.75767: variable 'ansible_shell_executable' from source: unknown 8435 1726776622.75773: variable 'ansible_connection' from source: unknown 8435 1726776622.75777: variable 'ansible_module_compression' from source: unknown 8435 1726776622.75781: variable 'ansible_shell_type' from source: unknown 8435 1726776622.75783: variable 'ansible_shell_executable' from source: unknown 8435 1726776622.75786: variable 'ansible_host' from source: host vars for 'managed_node1' 8435 1726776622.75791: variable 'ansible_pipelining' from source: unknown 8435 1726776622.75794: variable 'ansible_timeout' from source: unknown 8435 1726776622.75798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8435 1726776622.75913: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8435 1726776622.75925: variable 'omit' from source: magic vars 8435 1726776622.75932: starting attempt loop 8435 1726776622.75935: running the handler 8435 1726776622.76012: variable 'ansible_facts' from source: unknown 8435 1726776622.76123: _low_level_execute_command(): starting 8435 1726776622.76135: _low_level_execute_command(): executing: /bin/sh 
-c 'echo ~ && sleep 0' 8435 1726776622.78471: stdout chunk (state=2): >>>/root <<< 8435 1726776622.78596: stderr chunk (state=3): >>><<< 8435 1726776622.78603: stdout chunk (state=3): >>><<< 8435 1726776622.78622: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8435 1726776622.78636: _low_level_execute_command(): starting 8435 1726776622.78643: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010 `" && echo ansible-tmp-1726776622.7863078-8435-252251545174010="` echo /root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010 `" ) && sleep 0' 8435 1726776622.81481: stdout chunk (state=2): >>>ansible-tmp-1726776622.7863078-8435-252251545174010=/root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010 <<< 8435 1726776622.81615: stderr chunk (state=3): >>><<< 8435 1726776622.81621: stdout chunk (state=3): >>><<< 8435 1726776622.81636: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776622.7863078-8435-252251545174010=/root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010 , stderr= 8435 1726776622.81668: variable 'ansible_module_compression' from source: unknown 8435 1726776622.81722: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 8435 1726776622.81727: ANSIBALLZ: Acquiring lock 8435 1726776622.81732: ANSIBALLZ: Lock acquired: 140184657595568 8435 1726776622.81737: ANSIBALLZ: Creating module 8435 1726776623.03216: ANSIBALLZ: Writing module into payload 8435 1726776623.03362: ANSIBALLZ: Writing module 8435 1726776623.03386: ANSIBALLZ: Renaming module 8435 1726776623.03392: ANSIBALLZ: Done creating module 8435 1726776623.03406: variable 'ansible_facts' from source: unknown 8435 1726776623.03543: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010/AnsiballZ_systemd.py 8435 1726776623.03651: Sending initial 
data 8435 1726776623.03658: Sent initial data (154 bytes) 8435 1726776623.06429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpeopv4ocl /root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010/AnsiballZ_systemd.py <<< 8435 1726776623.08734: stderr chunk (state=3): >>><<< 8435 1726776623.08741: stdout chunk (state=3): >>><<< 8435 1726776623.08759: done transferring module to remote 8435 1726776623.08769: _low_level_execute_command(): starting 8435 1726776623.08775: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010/ /root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010/AnsiballZ_systemd.py && sleep 0' 8435 1726776623.11100: stderr chunk (state=2): >>><<< 8435 1726776623.11110: stdout chunk (state=2): >>><<< 8435 1726776623.11128: _low_level_execute_command() done: rc=0, stdout=, stderr= 8435 1726776623.11135: _low_level_execute_command(): starting 8435 1726776623.11140: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010/AnsiballZ_systemd.py && sleep 0' 8435 1726776623.39616: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:07:20 EDT", "WatchdogTimestampMonotonic": "24404613", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "676", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", 
"ExecMainStartTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ExecMainStartTimestampMonotonic": "23356599", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "676", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:07:19 EDT] ; stop_time=[n/a] ; pid=676 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18616320", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh":<<< 8435 1726776623.39661: stdout chunk (state=3): >>> "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22406", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", 
"PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service basic.target system.slice", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:07:20 EDT", "StateChangeTimestampMonotonic": "24404616", "InactiveExitTimestamp": "Thu 2024-09-19 16:07:19 EDT", "InactiveExitTimestampMonotonic": "23356643", "ActiveEnterTimestamp": "Thu 2024-09-19 16:07:20 EDT", "ActiveEnterTimestampMonotonic": "24404616", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": 
"no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ConditionTimestampMonotonic": "23355606", "AssertTimestamp": "Thu 2024-09-19 16:07:19 EDT", "AssertTimestampMonotonic": "23355607", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b85f8ab16fc34d90bf1e9620d92d7d18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8435 1726776623.41294: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. 
<<< 8435 1726776623.41344: stderr chunk (state=3): >>><<< 8435 1726776623.41350: stdout chunk (state=3): >>><<< 8435 1726776623.41371: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:07:20 EDT", "WatchdogTimestampMonotonic": "24404613", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "676", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ExecMainStartTimestampMonotonic": "23356599", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "676", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:07:19 EDT] ; stop_time=[n/a] ; pid=676 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18616320", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": 
"[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22406", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown 
cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service basic.target system.slice", "Documentation": "man:tuned(8) man:tuned.conf(5) 
man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:07:20 EDT", "StateChangeTimestampMonotonic": "24404616", "InactiveExitTimestamp": "Thu 2024-09-19 16:07:19 EDT", "InactiveExitTimestampMonotonic": "23356643", "ActiveEnterTimestamp": "Thu 2024-09-19 16:07:20 EDT", "ActiveEnterTimestampMonotonic": "24404616", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ConditionTimestampMonotonic": "23355606", "AssertTimestamp": "Thu 2024-09-19 16:07:19 EDT", "AssertTimestampMonotonic": "23355607", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b85f8ab16fc34d90bf1e9620d92d7d18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
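Based on the `module_args` in the invocation above (`name=tuned`, `state=started`, `enabled=true`), the `ansible_loop_var` of `item`, and the `__kernel_settings_services` variable loaded via `include_vars`, the task being traced is roughly equivalent to the following sketch (reconstructed from the log, not necessarily the role's verbatim source at `kernel_settings/tasks/main.yml:67`):

```yaml
# Sketch reconstructed from the logged module_args; the role's actual
# task file may differ in detail.
- name: Ensure required services are enabled and started
  service:
    name: "{{ item }}"
    state: started
    enabled: true
  loop: "{{ __kernel_settings_services }}"
```

Note that although the task uses the generic `service` action, the action plugin dispatched `ansible.legacy.systemd` on the target, which is why the result carries the full systemd unit property dump.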
8435 1726776623.41476: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8435 1726776623.41493: _low_level_execute_command(): starting 8435 1726776623.41499: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776622.7863078-8435-252251545174010/ > /dev/null 2>&1 && sleep 0' 8435 1726776623.43907: stderr chunk (state=2): >>><<< 8435 1726776623.43914: stdout chunk (state=2): >>><<< 8435 1726776623.43927: _low_level_execute_command() done: rc=0, stdout=, stderr= 8435 1726776623.43936: handler run complete 8435 1726776623.43969: attempt loop complete, returning result 8435 1726776623.43985: variable 'item' from source: unknown 8435 1726776623.44062: variable 'item' from source: unknown ok: [managed_node1] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:07:20 EDT", "ActiveEnterTimestampMonotonic": "24404616", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service basic.target system.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", 
"AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:07:19 EDT", "AssertTimestampMonotonic": "23355607", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ConditionTimestampMonotonic": "23355606", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "676", "ExecMainStartTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ExecMainStartTimestampMonotonic": "23356599", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:07:19 EDT] ; stop_time=[n/a] ; pid=676 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:07:19 EDT", "InactiveExitTimestampMonotonic": "23356643", "InvocationID": "b85f8ab16fc34d90bf1e9620d92d7d18", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "676", "MemoryAccounting": "yes", "MemoryCurrent": "18616320", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not 
set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:07:20 EDT", "StateChangeTimestampMonotonic": "24404616", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:07:20 EDT", "WatchdogTimestampMonotonic": "24404613", "WatchdogUSec": "0" } } 8435 1726776623.44167: dumping result to json 8435 1726776623.44186: done dumping result, returning 8435 1726776623.44194: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [120fa90a-8a95-f1be-6eb1-000000000148] 8435 1726776623.44200: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000148 8435 1726776623.44304: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000148 8435 1726776623.44308: WORKER PROCESS EXITING 8186 1726776623.44672: no more pending results, returning what we have 8186 1726776623.44675: results queue empty 8186 1726776623.44677: checking for any_errors_fatal 8186 1726776623.44681: done checking for any_errors_fatal 8186 1726776623.44682: checking for max_fail_percentage 8186 1726776623.44683: done checking for max_fail_percentage 8186 1726776623.44684: checking to see if all hosts have failed and the running result is not ok 8186 1726776623.44684: done checking to see if all hosts have failed 8186 
1726776623.44685: getting the remaining hosts for this loop 8186 1726776623.44686: done getting the remaining hosts for this loop 8186 1726776623.44689: getting the next task for host managed_node1 8186 1726776623.44694: done getting next task for host managed_node1 8186 1726776623.44697: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8186 1726776623.44701: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776623.44710: getting variables 8186 1726776623.44712: in VariableManager get_vars() 8186 1726776623.44741: Calling all_inventory to load vars for managed_node1 8186 1726776623.44743: Calling groups_inventory to load vars for managed_node1 8186 1726776623.44745: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776623.44754: Calling all_plugins_play to load vars for managed_node1 8186 1726776623.44757: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776623.44760: Calling groups_plugins_play to load vars for managed_node1 8186 1726776623.44937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776623.45145: done with get_vars() 8186 1726776623.45156: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 16:10:23 -0400 (0:00:00.708) 0:00:11.324 **** 8186 1726776623.45246: entering _queue_task() for managed_node1/file 8186 1726776623.45448: worker is 1 (out of 1 available) 8186 1726776623.45463: exiting _queue_task() for managed_node1/file 8186 1726776623.45477: done queuing things up, now waiting for results queue to drain 8186 1726776623.45479: waiting for pending results... 
8463 1726776623.45604: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8463 1726776623.45712: in run() - task 120fa90a-8a95-f1be-6eb1-000000000149 8463 1726776623.45731: variable 'ansible_search_path' from source: unknown 8463 1726776623.45735: variable 'ansible_search_path' from source: unknown 8463 1726776623.45762: calling self._execute() 8463 1726776623.45819: variable 'ansible_host' from source: host vars for 'managed_node1' 8463 1726776623.45828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8463 1726776623.45839: variable 'omit' from source: magic vars 8463 1726776623.45910: variable 'omit' from source: magic vars 8463 1726776623.45952: variable 'omit' from source: magic vars 8463 1726776623.45973: variable '__kernel_settings_profile_dir' from source: role '' all vars 8463 1726776623.46184: variable '__kernel_settings_profile_dir' from source: role '' all vars 8463 1726776623.46251: variable '__kernel_settings_profile_parent' from source: set_fact 8463 1726776623.46260: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8463 1726776623.46290: variable 'omit' from source: magic vars 8463 1726776623.46321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8463 1726776623.46349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8463 1726776623.46367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8463 1726776623.46382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8463 1726776623.46393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8463 1726776623.46415: variable 'inventory_hostname' from source: host 
vars for 'managed_node1' 8463 1726776623.46421: variable 'ansible_host' from source: host vars for 'managed_node1' 8463 1726776623.46425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8463 1726776623.46496: Set connection var ansible_shell_executable to /bin/sh 8463 1726776623.46504: Set connection var ansible_timeout to 10 8463 1726776623.46510: Set connection var ansible_module_compression to ZIP_DEFLATED 8463 1726776623.46513: Set connection var ansible_connection to ssh 8463 1726776623.46520: Set connection var ansible_pipelining to False 8463 1726776623.46526: Set connection var ansible_shell_type to sh 8463 1726776623.46542: variable 'ansible_shell_executable' from source: unknown 8463 1726776623.46545: variable 'ansible_connection' from source: unknown 8463 1726776623.46549: variable 'ansible_module_compression' from source: unknown 8463 1726776623.46553: variable 'ansible_shell_type' from source: unknown 8463 1726776623.46556: variable 'ansible_shell_executable' from source: unknown 8463 1726776623.46559: variable 'ansible_host' from source: host vars for 'managed_node1' 8463 1726776623.46563: variable 'ansible_pipelining' from source: unknown 8463 1726776623.46566: variable 'ansible_timeout' from source: unknown 8463 1726776623.46570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8463 1726776623.46703: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8463 1726776623.46714: variable 'omit' from source: magic vars 8463 1726776623.46718: starting attempt loop 8463 1726776623.46720: running the handler 8463 1726776623.46730: _low_level_execute_command(): starting 8463 1726776623.46736: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8463 1726776623.49036: stdout 
chunk (state=2): >>>/root <<< 8463 1726776623.49179: stderr chunk (state=3): >>><<< 8463 1726776623.49187: stdout chunk (state=3): >>><<< 8463 1726776623.49207: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8463 1726776623.49224: _low_level_execute_command(): starting 8463 1726776623.49233: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358 `" && echo ansible-tmp-1726776623.4921749-8463-143271229062358="` echo /root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358 `" ) && sleep 0' 8463 1726776623.52157: stdout chunk (state=2): >>>ansible-tmp-1726776623.4921749-8463-143271229062358=/root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358 <<< 8463 1726776623.52536: stderr chunk (state=3): >>><<< 8463 1726776623.52543: stdout chunk (state=3): >>><<< 8463 1726776623.52557: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776623.4921749-8463-143271229062358=/root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358 , stderr= 8463 1726776623.52594: variable 'ansible_module_compression' from source: unknown 8463 1726776623.52638: ANSIBALLZ: Using lock for file 8463 1726776623.52644: ANSIBALLZ: Acquiring lock 8463 1726776623.52648: ANSIBALLZ: Lock acquired: 140184657596720 8463 1726776623.52655: ANSIBALLZ: Creating module 8463 1726776623.66326: ANSIBALLZ: Writing module into payload 8463 1726776623.66542: ANSIBALLZ: Writing module 8463 1726776623.66565: ANSIBALLZ: Renaming module 8463 1726776623.66576: ANSIBALLZ: Done creating module 8463 1726776623.66592: variable 'ansible_facts' from source: unknown 8463 1726776623.66689: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358/AnsiballZ_file.py 8463 1726776623.67162: Sending initial data 8463 1726776623.67172: Sent initial data (151 bytes) 8463 1726776623.70537: 
stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpc794fvzw /root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358/AnsiballZ_file.py <<< 8463 1726776623.71734: stderr chunk (state=3): >>><<< 8463 1726776623.71744: stdout chunk (state=3): >>><<< 8463 1726776623.71765: done transferring module to remote 8463 1726776623.71781: _low_level_execute_command(): starting 8463 1726776623.71788: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358/ /root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358/AnsiballZ_file.py && sleep 0' 8463 1726776623.74619: stderr chunk (state=2): >>><<< 8463 1726776623.74634: stdout chunk (state=2): >>><<< 8463 1726776623.74653: _low_level_execute_command() done: rc=0, stdout=, stderr= 8463 1726776623.74659: _low_level_execute_command(): starting 8463 1726776623.74664: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358/AnsiballZ_file.py && sleep 0' 8463 1726776623.90782: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, 
"setype": null, "attributes": null}}} <<< 8463 1726776623.91900: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8463 1726776623.91948: stderr chunk (state=3): >>><<< 8463 1726776623.91956: stdout chunk (state=3): >>><<< 8463 1726776623.91972: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
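The `file` module result above reports the directory it created and the mode it applied. At the filesystem level the invocation boils down to the following simplified sketch (the real module is also idempotent about ownership and reports diff/SELinux context; `$TARGET` is an illustrative stand-in for the `/etc/tuned/kernel_settings` path used in the log so the sketch can be tried without root):

```shell
# Simplified sketch of what the file-module invocation in the log did on
# the managed node: create the profile directory with the requested mode.
TARGET="${TARGET:-/tmp/kernel_settings_demo}"   # log used /etc/tuned/kernel_settings
mkdir -p "$TARGET"
chmod 0755 "$TARGET"
stat -c '%a %F' "$TARGET"   # -> 755 directory
```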
8463 1726776623.92002: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8463 1726776623.92011: _low_level_execute_command(): starting 8463 1726776623.92015: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776623.4921749-8463-143271229062358/ > /dev/null 2>&1 && sleep 0' 8463 1726776623.94417: stderr chunk (state=2): >>><<< 8463 1726776623.94425: stdout chunk (state=2): >>><<< 8463 1726776623.94440: _low_level_execute_command() done: rc=0, stdout=, stderr= 8463 1726776623.94447: handler run complete 8463 1726776623.94469: attempt loop complete, returning result 8463 1726776623.94474: _execute() done 8463 1726776623.94477: dumping result to json 8463 1726776623.94483: done dumping result, returning 8463 1726776623.94491: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [120fa90a-8a95-f1be-6eb1-000000000149] 8463 1726776623.94496: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000149 8463 1726776623.94528: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000149 8463 1726776623.94532: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", 
"secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "state": "directory", "uid": 0 } 8186 1726776623.94681: no more pending results, returning what we have 8186 1726776623.94684: results queue empty 8186 1726776623.94685: checking for any_errors_fatal 8186 1726776623.94698: done checking for any_errors_fatal 8186 1726776623.94699: checking for max_fail_percentage 8186 1726776623.94700: done checking for max_fail_percentage 8186 1726776623.94700: checking to see if all hosts have failed and the running result is not ok 8186 1726776623.94701: done checking to see if all hosts have failed 8186 1726776623.94701: getting the remaining hosts for this loop 8186 1726776623.94703: done getting the remaining hosts for this loop 8186 1726776623.94706: getting the next task for host managed_node1 8186 1726776623.94713: done getting next task for host managed_node1 8186 1726776623.94717: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8186 1726776623.94720: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776623.94731: getting variables 8186 1726776623.94733: in VariableManager get_vars() 8186 1726776623.94763: Calling all_inventory to load vars for managed_node1 8186 1726776623.94766: Calling groups_inventory to load vars for managed_node1 8186 1726776623.94768: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776623.94777: Calling all_plugins_play to load vars for managed_node1 8186 1726776623.94779: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776623.94781: Calling groups_plugins_play to load vars for managed_node1 8186 1726776623.94905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776623.95024: done with get_vars() 8186 1726776623.95035: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 16:10:23 -0400 (0:00:00.498) 0:00:11.822 **** 8186 1726776623.95104: entering _queue_task() for managed_node1/slurp 8186 1726776623.95310: worker is 1 (out of 1 available) 8186 1726776623.95325: exiting _queue_task() for managed_node1/slurp 8186 1726776623.95338: done queuing things up, now waiting for results queue to drain 8186 1726776623.95341: waiting for pending results... 
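Each module run in this log follows the same low-level pattern visible in the `_low_level_execute_command()` records: create a private temp directory, transfer the AnsiballZ payload over sftp, mark it executable, run it with the remote Python, then remove the temp directory. A condensed local sketch of that sequence (paths and the payload are illustrative; the real temp dirs embed a timestamp, PID, and random suffix as shown above):

```shell
# Condensed version of the remote execution lifecycle seen in the log.
TMPDIR_DEMO="$(mktemp -d)"                         # log: umask 77 && mkdir -p "~/.ansible/tmp/ansible-tmp-<ts>-<pid>-<rand>"
PAYLOAD="$TMPDIR_DEMO/AnsiballZ_file.py"
printf '%s\n' 'print("module ran")' > "$PAYLOAD"   # stand-in for the sftp put of the zipped module
chmod u+x "$TMPDIR_DEMO" "$PAYLOAD"                # matches the "chmod u+x <tmpdir> <payload>" step
OUT="$(python3 "$PAYLOAD")"                        # matches the platform-python execution step
echo "$OUT"                                        # -> module ran
rm -rf "$TMPDIR_DEMO"                              # matches the "rm -f -r <tmpdir>" cleanup step
```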
8480 1726776623.95549: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8480 1726776623.95685: in run() - task 120fa90a-8a95-f1be-6eb1-00000000014a 8480 1726776623.95705: variable 'ansible_search_path' from source: unknown 8480 1726776623.95710: variable 'ansible_search_path' from source: unknown 8480 1726776623.95742: calling self._execute() 8480 1726776623.95810: variable 'ansible_host' from source: host vars for 'managed_node1' 8480 1726776623.95819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8480 1726776623.95827: variable 'omit' from source: magic vars 8480 1726776623.95921: variable 'omit' from source: magic vars 8480 1726776623.95971: variable 'omit' from source: magic vars 8480 1726776623.95997: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8480 1726776623.96273: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8480 1726776623.96349: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8480 1726776623.96379: variable 'omit' from source: magic vars 8480 1726776623.96417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8480 1726776623.96500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8480 1726776623.96521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8480 1726776623.96543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8480 1726776623.96556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8480 1726776623.96584: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8480 1726776623.96590: variable 'ansible_host' from source: host vars for 
'managed_node1' 8480 1726776623.96594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8480 1726776623.96686: Set connection var ansible_shell_executable to /bin/sh 8480 1726776623.96695: Set connection var ansible_timeout to 10 8480 1726776623.96701: Set connection var ansible_module_compression to ZIP_DEFLATED 8480 1726776623.96704: Set connection var ansible_connection to ssh 8480 1726776623.96711: Set connection var ansible_pipelining to False 8480 1726776623.96717: Set connection var ansible_shell_type to sh 8480 1726776623.96736: variable 'ansible_shell_executable' from source: unknown 8480 1726776623.96741: variable 'ansible_connection' from source: unknown 8480 1726776623.96744: variable 'ansible_module_compression' from source: unknown 8480 1726776623.96747: variable 'ansible_shell_type' from source: unknown 8480 1726776623.96750: variable 'ansible_shell_executable' from source: unknown 8480 1726776623.96753: variable 'ansible_host' from source: host vars for 'managed_node1' 8480 1726776623.96757: variable 'ansible_pipelining' from source: unknown 8480 1726776623.96759: variable 'ansible_timeout' from source: unknown 8480 1726776623.96763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8480 1726776623.96944: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8480 1726776623.96957: variable 'omit' from source: magic vars 8480 1726776623.96964: starting attempt loop 8480 1726776623.96967: running the handler 8480 1726776623.96979: _low_level_execute_command(): starting 8480 1726776623.96987: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8480 1726776623.99518: stdout chunk (state=2): >>>/root <<< 8480 1726776623.99639: stderr chunk (state=3): >>><<< 8480 
1726776623.99645: stdout chunk (state=3): >>><<< 8480 1726776623.99662: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8480 1726776623.99677: _low_level_execute_command(): starting 8480 1726776623.99683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412 `" && echo ansible-tmp-1726776623.9967234-8480-20459121323412="` echo /root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412 `" ) && sleep 0' 8480 1726776624.02148: stdout chunk (state=2): >>>ansible-tmp-1726776623.9967234-8480-20459121323412=/root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412 <<< 8480 1726776624.02282: stderr chunk (state=3): >>><<< 8480 1726776624.02291: stdout chunk (state=3): >>><<< 8480 1726776624.02307: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776623.9967234-8480-20459121323412=/root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412 , stderr= 8480 1726776624.02348: variable 'ansible_module_compression' from source: unknown 8480 1726776624.02383: ANSIBALLZ: Using lock for slurp 8480 1726776624.02389: ANSIBALLZ: Acquiring lock 8480 1726776624.02392: ANSIBALLZ: Lock acquired: 140184657595856 8480 1726776624.02396: ANSIBALLZ: Creating module 8480 1726776624.10812: ANSIBALLZ: Writing module into payload 8480 1726776624.10868: ANSIBALLZ: Writing module 8480 1726776624.10887: ANSIBALLZ: Renaming module 8480 1726776624.10893: ANSIBALLZ: Done creating module 8480 1726776624.10908: variable 'ansible_facts' from source: unknown 8480 1726776624.10964: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412/AnsiballZ_slurp.py 8480 1726776624.11061: Sending initial data 8480 1726776624.11067: Sent initial data (151 bytes) 8480 1726776624.13648: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpo2dso_w_ 
/root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412/AnsiballZ_slurp.py <<< 8480 1726776624.14672: stderr chunk (state=3): >>><<< 8480 1726776624.14683: stdout chunk (state=3): >>><<< 8480 1726776624.14698: done transferring module to remote 8480 1726776624.14707: _low_level_execute_command(): starting 8480 1726776624.14712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412/ /root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412/AnsiballZ_slurp.py && sleep 0' 8480 1726776624.17038: stderr chunk (state=2): >>><<< 8480 1726776624.17045: stdout chunk (state=2): >>><<< 8480 1726776624.17058: _low_level_execute_command() done: rc=0, stdout=, stderr= 8480 1726776624.17062: _low_level_execute_command(): starting 8480 1726776624.17067: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412/AnsiballZ_slurp.py && sleep 0' 8480 1726776624.31880: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 8480 1726776624.32905: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8480 1726776624.32955: stderr chunk (state=3): >>><<< 8480 1726776624.32962: stdout chunk (state=3): >>><<< 8480 1726776624.32978: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.14.221 closed. 
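The `slurp` module returns file contents base64-encoded; the `content` payload in the record above decodes to the currently active tuned profile:

```shell
# Decode the "content" field returned by slurp in the log above.
echo 'dmlydHVhbC1ndWVzdAo=' | base64 -d   # -> virtual-guest
```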
8480 1726776624.33000: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8480 1726776624.33010: _low_level_execute_command(): starting 8480 1726776624.33016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776623.9967234-8480-20459121323412/ > /dev/null 2>&1 && sleep 0' 8480 1726776624.35419: stderr chunk (state=2): >>><<< 8480 1726776624.35426: stdout chunk (state=2): >>><<< 8480 1726776624.35441: _low_level_execute_command() done: rc=0, stdout=, stderr= 8480 1726776624.35448: handler run complete 8480 1726776624.35460: attempt loop complete, returning result 8480 1726776624.35464: _execute() done 8480 1726776624.35468: dumping result to json 8480 1726776624.35472: done dumping result, returning 8480 1726776624.35479: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [120fa90a-8a95-f1be-6eb1-00000000014a] 8480 1726776624.35486: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014a 8480 1726776624.35515: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014a 8480 1726776624.35519: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "content": "dmlydHVhbC1ndWVzdAo=", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8186 1726776624.35672: no more pending results, returning what we have 8186 
1726776624.35675: results queue empty 8186 1726776624.35676: checking for any_errors_fatal 8186 1726776624.35684: done checking for any_errors_fatal 8186 1726776624.35685: checking for max_fail_percentage 8186 1726776624.35686: done checking for max_fail_percentage 8186 1726776624.35687: checking to see if all hosts have failed and the running result is not ok 8186 1726776624.35687: done checking to see if all hosts have failed 8186 1726776624.35688: getting the remaining hosts for this loop 8186 1726776624.35689: done getting the remaining hosts for this loop 8186 1726776624.35692: getting the next task for host managed_node1 8186 1726776624.35700: done getting next task for host managed_node1 8186 1726776624.35702: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8186 1726776624.35706: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776624.35716: getting variables 8186 1726776624.35717: in VariableManager get_vars() 8186 1726776624.35750: Calling all_inventory to load vars for managed_node1 8186 1726776624.35753: Calling groups_inventory to load vars for managed_node1 8186 1726776624.35754: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776624.35763: Calling all_plugins_play to load vars for managed_node1 8186 1726776624.35765: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776624.35769: Calling groups_plugins_play to load vars for managed_node1 8186 1726776624.35886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776624.36006: done with get_vars() 8186 1726776624.36014: done getting variables 8186 1726776624.36057: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 16:10:24 -0400 (0:00:00.409) 0:00:12.232 **** 8186 1726776624.36083: entering _queue_task() for managed_node1/set_fact 8186 1726776624.36252: worker is 1 (out of 1 available) 8186 1726776624.36264: exiting _queue_task() for managed_node1/set_fact 8186 1726776624.36279: done queuing things up, now waiting for results queue to drain 8186 1726776624.36282: waiting for pending results... 
8493 1726776624.36392: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile
8493 1726776624.36498: in run() - task 120fa90a-8a95-f1be-6eb1-00000000014b
8493 1726776624.36516: variable 'ansible_search_path' from source: unknown
8493 1726776624.36520: variable 'ansible_search_path' from source: unknown
8493 1726776624.36549: calling self._execute()
8493 1726776624.36678: variable 'ansible_host' from source: host vars for 'managed_node1'
8493 1726776624.36687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8493 1726776624.36695: variable 'omit' from source: magic vars
8493 1726776624.36769: variable 'omit' from source: magic vars
8493 1726776624.36805: variable 'omit' from source: magic vars
8493 1726776624.37079: variable '__kernel_settings_tuned_profile' from source: role '' all vars
8493 1726776624.37088: variable '__cur_profile' from source: task vars
8493 1726776624.37193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
8493 1726776624.38652: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
8493 1726776624.38706: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
8493 1726776624.38736: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
8493 1726776624.38762: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
8493 1726776624.38783: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
8493 1726776624.38839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
8493 1726776624.38860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
8493 1726776624.38879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
8493 1726776624.38905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
8493 1726776624.38915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
8493 1726776624.38989: variable '__kernel_settings_tuned_current_profile' from source: set_fact
8493 1726776624.39021: variable 'omit' from source: magic vars
8493 1726776624.39041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8493 1726776624.39059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8493 1726776624.39075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8493 1726776624.39086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8493 1726776624.39093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8493 1726776624.39114: variable 'inventory_hostname' from source: host vars for 'managed_node1'
8493 1726776624.39118: variable 'ansible_host' from source: host vars for 'managed_node1'
8493 1726776624.39121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8493 1726776624.39186: Set connection var ansible_shell_executable to /bin/sh
8493 1726776624.39192: Set connection var ansible_timeout to 10
8493 1726776624.39195: Set connection var ansible_module_compression to ZIP_DEFLATED
8493 1726776624.39197: Set connection var ansible_connection to ssh
8493 1726776624.39201: Set connection var ansible_pipelining to False
8493 1726776624.39204: Set connection var ansible_shell_type to sh
8493 1726776624.39217: variable 'ansible_shell_executable' from source: unknown
8493 1726776624.39220: variable 'ansible_connection' from source: unknown
8493 1726776624.39222: variable 'ansible_module_compression' from source: unknown
8493 1726776624.39224: variable 'ansible_shell_type' from source: unknown
8493 1726776624.39225: variable 'ansible_shell_executable' from source: unknown
8493 1726776624.39227: variable 'ansible_host' from source: host vars for 'managed_node1'
8493 1726776624.39232: variable 'ansible_pipelining' from source: unknown
8493 1726776624.39234: variable 'ansible_timeout' from source: unknown
8493 1726776624.39236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8493 1726776624.39291: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
8493 1726776624.39300: variable 'omit' from source: magic vars
8493 1726776624.39303: starting attempt loop
8493 1726776624.39305: running the handler
8493 1726776624.39312: handler run complete
8493 1726776624.39319: attempt loop complete, returning result
8493 1726776624.39321: _execute() done
8493 1726776624.39324: dumping result to json
8493 1726776624.39326: done dumping result, returning
8493 1726776624.39332: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [120fa90a-8a95-f1be-6eb1-00000000014b]
8493 1726776624.39337: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014b
8493 1726776624.39353: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014b
8493 1726776624.39355: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "__kernel_settings_active_profile": "virtual-guest kernel_settings"
    },
    "changed": false
}
8186 1726776624.39761: no more pending results, returning what we have
8186 1726776624.39763: results queue empty
8186 1726776624.39763: checking for any_errors_fatal
8186 1726776624.39765: done checking for any_errors_fatal
8186 1726776624.39766: checking for max_fail_percentage
8186 1726776624.39767: done checking for max_fail_percentage
8186 1726776624.39769: checking to see if all hosts have failed and the running result is not ok
8186 1726776624.39769: done checking to see if all hosts have failed
8186 1726776624.39770: getting the remaining hosts for this loop
8186 1726776624.39771: done getting the remaining hosts for this loop
8186 1726776624.39773: getting the next task for host managed_node1
8186 1726776624.39778: done getting next task for host managed_node1
8186 1726776624.39780: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile
8186 1726776624.39782: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False
8186 1726776624.39792: getting variables
8186 1726776624.39793: in VariableManager get_vars()
8186 1726776624.39813: Calling all_inventory to load vars for managed_node1
8186 1726776624.39814: Calling groups_inventory to load vars for managed_node1
8186 1726776624.39816: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776624.39821: Calling all_plugins_play to load vars for managed_node1
8186 1726776624.39823: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776624.39825: Calling groups_plugins_play to load vars for managed_node1
8186 1726776624.39917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776624.40038: done with get_vars()
8186 1726776624.40045: done getting variables
8186 1726776624.40126: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
Thursday 19 September 2024 16:10:24 -0400 (0:00:00.040) 0:00:12.273 ****
8186 1726776624.40150: entering _queue_task() for managed_node1/copy
8186 1726776624.40307: worker is 1 (out of 1 available)
8186 1726776624.40319: exiting _queue_task() for managed_node1/copy
8186 1726776624.40332: done queuing things up, now waiting for results queue to drain
8186 1726776624.40335: waiting for pending results...
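As an aside, the "Set active_profile" result recorded above (the fact __kernel_settings_active_profile becoming "virtual-guest kernel_settings") is consistent with a set_fact task along these lines. This is a hypothetical reconstruction from the variable names visible in the log, not the verbatim task at roles/kernel_settings/tasks/main.yml:85:

```yaml
# Hypothetical sketch reconstructed from logged variable names; the
# real task in the role may differ.
- name: Set active_profile
  set_fact:
    __kernel_settings_active_profile: "{{ __kernel_settings_tuned_current_profile }}
      {{ __kernel_settings_tuned_profile }}"
```

Here __kernel_settings_tuned_current_profile would carry "virtual-guest" (from an earlier set_fact, per the log) and __kernel_settings_tuned_profile the role default "kernel_settings".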
8494 1726776624.40445: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile
8494 1726776624.40548: in run() - task 120fa90a-8a95-f1be-6eb1-00000000014c
8494 1726776624.40563: variable 'ansible_search_path' from source: unknown
8494 1726776624.40566: variable 'ansible_search_path' from source: unknown
8494 1726776624.40591: calling self._execute()
8494 1726776624.40645: variable 'ansible_host' from source: host vars for 'managed_node1'
8494 1726776624.40652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8494 1726776624.40658: variable 'omit' from source: magic vars
8494 1726776624.40723: variable 'omit' from source: magic vars
8494 1726776624.40760: variable 'omit' from source: magic vars
8494 1726776624.40780: variable '__kernel_settings_active_profile' from source: set_fact
8494 1726776624.40986: variable '__kernel_settings_active_profile' from source: set_fact
8494 1726776624.41008: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
8494 1726776624.41058: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars
8494 1726776624.41111: variable '__kernel_settings_tuned_dir' from source: role '' all vars
8494 1726776624.41133: variable 'omit' from source: magic vars
8494 1726776624.41162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8494 1726776624.41187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8494 1726776624.41204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8494 1726776624.41217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8494 1726776624.41224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8494 1726776624.41246: variable 'inventory_hostname' from source: host vars for 'managed_node1'
8494 1726776624.41250: variable 'ansible_host' from source: host vars for 'managed_node1'
8494 1726776624.41253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8494 1726776624.41316: Set connection var ansible_shell_executable to /bin/sh
8494 1726776624.41322: Set connection var ansible_timeout to 10
8494 1726776624.41327: Set connection var ansible_module_compression to ZIP_DEFLATED
8494 1726776624.41344: Set connection var ansible_connection to ssh
8494 1726776624.41352: Set connection var ansible_pipelining to False
8494 1726776624.41357: Set connection var ansible_shell_type to sh
8494 1726776624.41371: variable 'ansible_shell_executable' from source: unknown
8494 1726776624.41375: variable 'ansible_connection' from source: unknown
8494 1726776624.41379: variable 'ansible_module_compression' from source: unknown
8494 1726776624.41382: variable 'ansible_shell_type' from source: unknown
8494 1726776624.41385: variable 'ansible_shell_executable' from source: unknown
8494 1726776624.41388: variable 'ansible_host' from source: host vars for 'managed_node1'
8494 1726776624.41392: variable 'ansible_pipelining' from source: unknown
8494 1726776624.41396: variable 'ansible_timeout' from source: unknown
8494 1726776624.41400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8494 1726776624.41513: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
8494 1726776624.41523: variable 'omit' from source: magic vars
8494 1726776624.41527: starting attempt loop
8494 1726776624.41532: running the handler
8494 1726776624.41540: _low_level_execute_command(): starting
8494 1726776624.41548: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8494 1726776624.43848: stdout chunk (state=2): >>>/root <<<
8494 1726776624.43968: stderr chunk (state=3): >>><<<
8494 1726776624.43974: stdout chunk (state=3): >>><<<
8494 1726776624.43992: _low_level_execute_command() done: rc=0, stdout=/root , stderr=
8494 1726776624.44006: _low_level_execute_command(): starting
8494 1726776624.44011: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139 `" && echo ansible-tmp-1726776624.440012-8494-256626382557139="` echo /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139 `" ) && sleep 0'
8494 1726776624.46448: stdout chunk (state=2): >>>ansible-tmp-1726776624.440012-8494-256626382557139=/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139 <<<
8494 1726776624.46581: stderr chunk (state=3): >>><<<
8494 1726776624.46588: stdout chunk (state=3): >>><<<
8494 1726776624.46603: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776624.440012-8494-256626382557139=/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139 , stderr=
8494 1726776624.46677: variable 'ansible_module_compression' from source: unknown
8494 1726776624.46719: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
8494 1726776624.46751: variable 'ansible_facts' from source: unknown
8494 1726776624.46818: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/AnsiballZ_stat.py
8494 1726776624.46905: Sending initial data
8494 1726776624.46912: Sent initial data (150 bytes)
8494 1726776624.49428: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmp9k5uyfb3 /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/AnsiballZ_stat.py <<<
8494 1726776624.50540: stderr chunk (state=3): >>><<<
8494 1726776624.50547: stdout chunk (state=3): >>><<<
8494 1726776624.50567: done transferring module to remote
8494 1726776624.50581: _low_level_execute_command(): starting
8494 1726776624.50589: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/ /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/AnsiballZ_stat.py && sleep 0'
8494 1726776624.53835: stderr chunk (state=2): >>><<<
8494 1726776624.53844: stdout chunk (state=2): >>><<<
8494 1726776624.53860: _low_level_execute_command() done: rc=0, stdout=, stderr=
8494 1726776624.53865: _low_level_execute_command(): starting
8494 1726776624.53870: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/AnsiballZ_stat.py && sleep 0'
8494 1726776624.70562: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726776624.3177333, "mtime": 1726776440.3730824, "ctime": 1726776440.3730824, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<<
8494 1726776624.72193: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<<
8494 1726776624.72203: stdout chunk (state=3): >>><<<
8494 1726776624.72214: stderr chunk (state=3): >>><<<
8494 1726776624.72231: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726776624.3177333, "mtime": 1726776440.3730824, "ctime": 1726776440.3730824, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.14.221 closed.
8494 1726776624.72286: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/', '_ansible_remote_tmp': '~/.ansible/tmp'})
8494 1726776624.72795: Sending initial data
8494 1726776624.72802: Sent initial data (139 bytes)
8494 1726776624.77849: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpv18cyvmn /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/source <<<
8494 1726776624.86068: stderr chunk (state=3): >>><<<
8494 1726776624.86080: stdout chunk (state=3): >>><<<
8494 1726776624.86105: _low_level_execute_command(): starting
8494 1726776624.86114: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/ /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/source && sleep 0'
8494 1726776624.88699: stderr chunk (state=2): >>><<<
8494 1726776624.88709: stdout chunk (state=2): >>><<<
8494 1726776624.88724: _low_level_execute_command() done: rc=0, stdout=, stderr=
8494 1726776624.88745: variable 'ansible_module_compression' from source: unknown
8494 1726776624.88781: ANSIBALLZ: Using generic lock for ansible.legacy.copy
8494 1726776624.88786: ANSIBALLZ: Acquiring lock
8494 1726776624.88789: ANSIBALLZ: Lock acquired: 140184657595568
8494 1726776624.88793: ANSIBALLZ: Creating module
8494 1726776625.01265: ANSIBALLZ: Writing module into payload
8494 1726776625.01454: ANSIBALLZ: Writing module
8494 1726776625.01476: ANSIBALLZ: Renaming module
8494 1726776625.01485: ANSIBALLZ: Done creating module
8494 1726776625.01498: variable 'ansible_facts' from source: unknown
8494 1726776625.01610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/AnsiballZ_copy.py
8494 1726776625.01965: Sending initial data
8494 1726776625.01973: Sent initial data (150 bytes)
8494 1726776625.04446: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmp9_u0eeox /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/AnsiballZ_copy.py <<<
8494 1726776625.05533: stderr chunk (state=3): >>><<<
8494 1726776625.05542: stdout chunk (state=3): >>><<<
8494 1726776625.05561: done transferring module to remote
8494 1726776625.05571: _low_level_execute_command(): starting
8494 1726776625.05577: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/ /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/AnsiballZ_copy.py && sleep 0'
8494 1726776625.07935: stderr chunk (state=2): >>><<<
8494 1726776625.07946: stdout chunk (state=2): >>><<<
8494 1726776625.07964: _low_level_execute_command() done: rc=0, stdout=, stderr=
8494 1726776625.07970: _low_level_execute_command(): starting
8494 1726776625.07976: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/AnsiballZ_copy.py && sleep 0'
8494 1726776625.24193: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/source", "_original_basename": "tmpv18cyvmn", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<<
8494 1726776625.25383: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<<
8494 1726776625.25401: stdout chunk (state=3): >>><<<
8494 1726776625.25414: stderr chunk (state=3): >>><<<
8494 1726776625.25431: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/source", "_original_basename": "tmpv18cyvmn", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.14.221 closed.
8494 1726776625.25468: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/source', '_original_basename': 'tmpv18cyvmn', 'follow': False, 'checksum': 'a79569d3860cb6a066e0e92c8b22ffd0e8796bfd', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/', '_ansible_remote_tmp': '~/.ansible/tmp'})
8494 1726776625.25483: _low_level_execute_command(): starting
8494 1726776625.25490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/ > /dev/null 2>&1 && sleep 0'
8494 1726776625.29872: stderr chunk (state=2): >>><<<
8494 1726776625.29885: stdout chunk (state=2): >>><<<
8494 1726776625.29902: _low_level_execute_command() done: rc=0, stdout=, stderr=
8494 1726776625.29911: handler run complete
8494 1726776625.29940: attempt loop complete, returning result
8494 1726776625.29946: _execute() done
8494 1726776625.29949: dumping result to json
8494 1726776625.29955: done dumping result, returning
8494 1726776625.29963: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [120fa90a-8a95-f1be-6eb1-00000000014c]
8494 1726776625.29969: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014c
8494 1726776625.30011: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014c
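The sha1 `checksum` values threaded through the stat and copy output above are how the copy action decides whether a transfer is needed: it stats the destination with `checksum_algorithm: sha1` and compares that digest against the local source's. A minimal sketch of that comparison (function names and paths here are illustrative, not Ansible's internals verbatim):

```python
import hashlib


def sha1_checksum(path: str) -> str:
    """Return the SHA-1 hex digest of a file's contents, as the stat and
    copy modules report it in their 'checksum' field."""
    digest = hashlib.sha1()
    with open(path, "rb") as handle:
        # Read in chunks so large files do not need to fit in memory.
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def needs_transfer(local_src: str, remote_checksum: str) -> bool:
    """A copy-style action transfers the source only when digests differ."""
    return sha1_checksum(local_src) != remote_checksum
```

In the log, the remote file's checksum (633f07e1...) differed from the rendered source's (a79569d3...), so the copy went ahead and the task reported changed=true.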
8494 1726776625.30015: WORKER PROCESS EXITING
changed: [managed_node1] => {
    "changed": true,
    "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd",
    "dest": "/etc/tuned/active_profile",
    "gid": 0,
    "group": "root",
    "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324",
    "mode": "0600",
    "owner": "root",
    "secontext": "system_u:object_r:tuned_rw_etc_t:s0",
    "size": 30,
    "src": "/root/.ansible/tmp/ansible-tmp-1726776624.440012-8494-256626382557139/source",
    "state": "file",
    "uid": 0
}
8186 1726776625.30818: no more pending results, returning what we have
8186 1726776625.30821: results queue empty
8186 1726776625.30822: checking for any_errors_fatal
8186 1726776625.30827: done checking for any_errors_fatal
8186 1726776625.30827: checking for max_fail_percentage
8186 1726776625.30833: done checking for max_fail_percentage
8186 1726776625.30834: checking to see if all hosts have failed and the running result is not ok
8186 1726776625.30835: done checking to see if all hosts have failed
8186 1726776625.30836: getting the remaining hosts for this loop
8186 1726776625.30837: done getting the remaining hosts for this loop
8186 1726776625.30840: getting the next task for host managed_node1
8186 1726776625.30846: done getting next task for host managed_node1
8186 1726776625.30849: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual
8186 1726776625.30852: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False
8186 1726776625.30861: getting variables
8186 1726776625.30863: in VariableManager get_vars()
8186 1726776625.30894: Calling all_inventory to load vars for managed_node1
8186 1726776625.30897: Calling groups_inventory to load vars for managed_node1
8186 1726776625.30900: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776625.30908: Calling all_plugins_play to load vars for managed_node1
8186 1726776625.30911: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776625.30913: Calling groups_plugins_play to load vars for managed_node1
8186 1726776625.31112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776625.31313: done with get_vars()
8186 1726776625.31323: done getting variables
8186 1726776625.31383: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] ***
task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
Thursday 19 September 2024 16:10:25 -0400 (0:00:00.912) 0:00:13.185 ****
8186 1726776625.31413: entering _queue_task() for managed_node1/copy
8186 1726776625.31616: worker is 1 (out of 1 available)
8186 1726776625.31631: exiting _queue_task() for managed_node1/copy
8186 1726776625.31642: done queuing things up, now waiting for results queue to drain
8186 1726776625.31644: waiting for pending results...
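Both file-writing steps in this stretch of the role ("Ensure kernel_settings is in active_profile" above, and "Set profile_mode to manual" starting here) come down to copy tasks that write a small file under /etc/tuned with mode 0600. A hypothetical sketch consistent with the module_args logged for the first of them (the actual tasks at main.yml:91 and main.yml:99 may differ):

```yaml
# Sketch reconstructed from the logged module_args; not the role's
# verbatim source.
- name: Ensure kernel_settings is in active_profile
  copy:
    content: "{{ __kernel_settings_active_profile }}\n"
    dest: /etc/tuned/active_profile
    mode: "0600"
```

The logged size of 30 bytes matches this reading: "virtual-guest kernel_settings" is 29 characters plus a trailing newline.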
8551 1726776625.32121: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual
8551 1726776625.32258: in run() - task 120fa90a-8a95-f1be-6eb1-00000000014d
8551 1726776625.32284: variable 'ansible_search_path' from source: unknown
8551 1726776625.32290: variable 'ansible_search_path' from source: unknown
8551 1726776625.32322: calling self._execute()
8551 1726776625.32396: variable 'ansible_host' from source: host vars for 'managed_node1'
8551 1726776625.32407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8551 1726776625.32415: variable 'omit' from source: magic vars
8551 1726776625.32512: variable 'omit' from source: magic vars
8551 1726776625.32568: variable 'omit' from source: magic vars
8551 1726776625.32596: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars
8551 1726776625.32874: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars
8551 1726776625.32951: variable '__kernel_settings_tuned_dir' from source: role '' all vars
8551 1726776625.32983: variable 'omit' from source: magic vars
8551 1726776625.33021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
8551 1726776625.33199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
8551 1726776625.33221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
8551 1726776625.33241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8551 1726776625.33253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
8551 1726776625.33285: variable 'inventory_hostname' from source: host vars for 'managed_node1'
8551 1726776625.33291: variable 'ansible_host' from source: host vars for 'managed_node1'
8551 1726776625.33296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8551 1726776625.33397: Set connection var ansible_shell_executable to /bin/sh
8551 1726776625.33405: Set connection var ansible_timeout to 10
8551 1726776625.33411: Set connection var ansible_module_compression to ZIP_DEFLATED
8551 1726776625.33414: Set connection var ansible_connection to ssh
8551 1726776625.33421: Set connection var ansible_pipelining to False
8551 1726776625.33426: Set connection var ansible_shell_type to sh
8551 1726776625.33448: variable 'ansible_shell_executable' from source: unknown
8551 1726776625.33452: variable 'ansible_connection' from source: unknown
8551 1726776625.33456: variable 'ansible_module_compression' from source: unknown
8551 1726776625.33459: variable 'ansible_shell_type' from source: unknown
8551 1726776625.33462: variable 'ansible_shell_executable' from source: unknown
8551 1726776625.33465: variable 'ansible_host' from source: host vars for 'managed_node1'
8551 1726776625.33468: variable 'ansible_pipelining' from source: unknown
8551 1726776625.33471: variable 'ansible_timeout' from source: unknown
8551 1726776625.33475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
8551 1726776625.33592: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
8551 1726776625.33604: variable 'omit' from source: magic vars
8551 1726776625.33609: starting attempt loop
8551 1726776625.33613: running the handler
8551 1726776625.33624: _low_level_execute_command(): starting
8551 1726776625.33635: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
8551 1726776625.36767: stdout chunk (state=2): >>>/root <<<
8551 1726776625.36921: stderr chunk (state=3): >>><<<
8551 1726776625.36932: stdout chunk (state=3): >>><<<
8551 1726776625.36954: _low_level_execute_command() done: rc=0, stdout=/root , stderr=
8551 1726776625.36968: _low_level_execute_command(): starting
8551 1726776625.36974: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053 `" && echo ansible-tmp-1726776625.369624-8551-265003330622053="` echo /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053 `" ) && sleep 0'
8551 1726776625.39598: stdout chunk (state=2): >>>ansible-tmp-1726776625.369624-8551-265003330622053=/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053 <<<
8551 1726776625.39786: stderr chunk (state=3): >>><<<
8551 1726776625.39794: stdout chunk (state=3): >>><<<
8551 1726776625.39813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776625.369624-8551-265003330622053=/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053 , stderr=
8551 1726776625.39899: variable 'ansible_module_compression' from source: unknown
8551 1726776625.39960: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED
8551 1726776625.39997: variable 'ansible_facts' from source: unknown
8551 1726776625.40100: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/AnsiballZ_stat.py
8551 1726776625.40546: Sending initial data
8551 1726776625.40553: Sent initial data (150 bytes)
8551 1726776625.42906: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpqhpnn9ct /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/AnsiballZ_stat.py <<<
8551 1726776625.44459: stderr chunk (state=3): >>><<<
8551 1726776625.44468: stdout chunk (state=3): >>><<<
8551 1726776625.44489: done transferring module to remote
8551
1726776625.44500: _low_level_execute_command(): starting 8551 1726776625.44506: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/ /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/AnsiballZ_stat.py && sleep 0' 8551 1726776625.47010: stderr chunk (state=2): >>><<< 8551 1726776625.47017: stdout chunk (state=2): >>><<< 8551 1726776625.47033: _low_level_execute_command() done: rc=0, stdout=, stderr= 8551 1726776625.47040: _low_level_execute_command(): starting 8551 1726776625.47045: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/AnsiballZ_stat.py && sleep 0' 8551 1726776625.62908: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726776439.9770825, "mtime": 1726776440.3730824, "ctime": 1726776440.3730824, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8551 1726776625.64064: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. 
<<< 8551 1726776625.64106: stderr chunk (state=3): >>><<< 8551 1726776625.64113: stdout chunk (state=3): >>><<< 8551 1726776625.64130: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726776439.9770825, "mtime": 1726776440.3730824, "ctime": 1726776440.3730824, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.14.221 closed. 
8551 1726776625.64176: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8551 1726776625.64257: Sending initial data 8551 1726776625.64264: Sent initial data (139 bytes) 8551 1726776625.66899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpwjdyima8 /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/source <<< 8551 1726776625.67218: stderr chunk (state=3): >>><<< 8551 1726776625.67224: stdout chunk (state=3): >>><<< 8551 1726776625.67244: _low_level_execute_command(): starting 8551 1726776625.67250: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/ /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/source && sleep 0' 8551 1726776625.69532: stderr chunk (state=2): >>><<< 8551 1726776625.69539: stdout chunk (state=2): >>><<< 8551 1726776625.69552: _low_level_execute_command() done: rc=0, stdout=, stderr= 8551 1726776625.69573: variable 'ansible_module_compression' from source: unknown 8551 1726776625.69606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8551 1726776625.69623: variable 'ansible_facts' from source: unknown 8551 
1726776625.69682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/AnsiballZ_copy.py 8551 1726776625.69762: Sending initial data 8551 1726776625.69769: Sent initial data (150 bytes) 8551 1726776625.72186: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpgxeomm4n /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/AnsiballZ_copy.py <<< 8551 1726776625.73257: stderr chunk (state=3): >>><<< 8551 1726776625.73265: stdout chunk (state=3): >>><<< 8551 1726776625.73284: done transferring module to remote 8551 1726776625.73294: _low_level_execute_command(): starting 8551 1726776625.73299: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/ /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/AnsiballZ_copy.py && sleep 0' 8551 1726776625.75653: stderr chunk (state=2): >>><<< 8551 1726776625.75661: stdout chunk (state=2): >>><<< 8551 1726776625.75676: _low_level_execute_command() done: rc=0, stdout=, stderr= 8551 1726776625.75680: _low_level_execute_command(): starting 8551 1726776625.75685: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/AnsiballZ_copy.py && sleep 0' 8551 1726776625.92076: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/source", 
"_original_basename": "tmpwjdyima8", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8551 1726776625.93339: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8551 1726776625.93385: stderr chunk (state=3): >>><<< 8551 1726776625.93393: stdout chunk (state=3): >>><<< 8551 1726776625.93413: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/source", "_original_basename": "tmpwjdyima8", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
8551 1726776625.93451: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/source', '_original_basename': 'tmpwjdyima8', 'follow': False, 'checksum': '3ef9f23deed2e23d3ef2b88b842fb882313e15ce', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8551 1726776625.93466: _low_level_execute_command(): starting 8551 1726776625.93473: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/ > /dev/null 2>&1 && sleep 0' 8551 1726776625.96033: stderr chunk (state=2): >>><<< 8551 1726776625.96041: stdout chunk (state=2): >>><<< 8551 1726776625.96054: _low_level_execute_command() done: rc=0, stdout=, stderr= 8551 1726776625.96072: handler run complete 8551 1726776625.96097: attempt loop complete, returning result 8551 1726776625.96102: _execute() done 8551 1726776625.96107: dumping result to json 8551 1726776625.96112: done dumping result, returning 8551 1726776625.96120: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [120fa90a-8a95-f1be-6eb1-00000000014d] 8551 1726776625.96129: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014d 8551 1726776625.96163: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014d 8551 
1726776625.96167: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "src": "/root/.ansible/tmp/ansible-tmp-1726776625.369624-8551-265003330622053/source", "state": "file", "uid": 0 } 8186 1726776625.96422: no more pending results, returning what we have 8186 1726776625.96425: results queue empty 8186 1726776625.96426: checking for any_errors_fatal 8186 1726776625.96433: done checking for any_errors_fatal 8186 1726776625.96434: checking for max_fail_percentage 8186 1726776625.96435: done checking for max_fail_percentage 8186 1726776625.96436: checking to see if all hosts have failed and the running result is not ok 8186 1726776625.96436: done checking to see if all hosts have failed 8186 1726776625.96437: getting the remaining hosts for this loop 8186 1726776625.96438: done getting the remaining hosts for this loop 8186 1726776625.96441: getting the next task for host managed_node1 8186 1726776625.96448: done getting next task for host managed_node1 8186 1726776625.96451: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8186 1726776625.96454: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776625.96465: getting variables 8186 1726776625.96466: in VariableManager get_vars() 8186 1726776625.96499: Calling all_inventory to load vars for managed_node1 8186 1726776625.96502: Calling groups_inventory to load vars for managed_node1 8186 1726776625.96504: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776625.96512: Calling all_plugins_play to load vars for managed_node1 8186 1726776625.96515: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776625.96518: Calling groups_plugins_play to load vars for managed_node1 8186 1726776625.96685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776625.96887: done with get_vars() 8186 1726776625.96898: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 16:10:25 -0400 (0:00:00.655) 0:00:13.841 **** 8186 1726776625.96978: entering _queue_task() for managed_node1/fedora.linux_system_roles.kernel_settings_get_config 8186 1726776625.97194: worker is 1 (out of 1 available) 8186 1726776625.97209: exiting _queue_task() for managed_node1/fedora.linux_system_roles.kernel_settings_get_config 8186 1726776625.97222: done queuing things up, now waiting for results queue to drain 8186 1726776625.97225: waiting for pending results... 
8586 1726776625.97349: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get current config 8586 1726776625.97467: in run() - task 120fa90a-8a95-f1be-6eb1-00000000014e 8586 1726776625.97487: variable 'ansible_search_path' from source: unknown 8586 1726776625.97491: variable 'ansible_search_path' from source: unknown 8586 1726776625.97518: calling self._execute() 8586 1726776625.97583: variable 'ansible_host' from source: host vars for 'managed_node1' 8586 1726776625.97592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8586 1726776625.97601: variable 'omit' from source: magic vars 8586 1726776625.97678: variable 'omit' from source: magic vars 8586 1726776625.97716: variable 'omit' from source: magic vars 8586 1726776625.97739: variable '__kernel_settings_profile_filename' from source: role '' all vars 8586 1726776625.97956: variable '__kernel_settings_profile_filename' from source: role '' all vars 8586 1726776625.98072: variable '__kernel_settings_profile_dir' from source: role '' all vars 8586 1726776625.98141: variable '__kernel_settings_profile_parent' from source: set_fact 8586 1726776625.98149: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8586 1726776625.98181: variable 'omit' from source: magic vars 8586 1726776625.98217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8586 1726776625.98246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8586 1726776625.98264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8586 1726776625.98279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8586 1726776625.98290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 8586 1726776625.98314: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8586 1726776625.98320: variable 'ansible_host' from source: host vars for 'managed_node1' 8586 1726776625.98324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8586 1726776625.98394: Set connection var ansible_shell_executable to /bin/sh 8586 1726776625.98402: Set connection var ansible_timeout to 10 8586 1726776625.98408: Set connection var ansible_module_compression to ZIP_DEFLATED 8586 1726776625.98412: Set connection var ansible_connection to ssh 8586 1726776625.98419: Set connection var ansible_pipelining to False 8586 1726776625.98425: Set connection var ansible_shell_type to sh 8586 1726776625.98441: variable 'ansible_shell_executable' from source: unknown 8586 1726776625.98445: variable 'ansible_connection' from source: unknown 8586 1726776625.98447: variable 'ansible_module_compression' from source: unknown 8586 1726776625.98449: variable 'ansible_shell_type' from source: unknown 8586 1726776625.98451: variable 'ansible_shell_executable' from source: unknown 8586 1726776625.98453: variable 'ansible_host' from source: host vars for 'managed_node1' 8586 1726776625.98455: variable 'ansible_pipelining' from source: unknown 8586 1726776625.98457: variable 'ansible_timeout' from source: unknown 8586 1726776625.98459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8586 1726776625.98584: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8586 1726776625.98593: variable 'omit' from source: magic vars 8586 1726776625.98597: starting attempt loop 8586 1726776625.98599: running the handler 8586 1726776625.98607: _low_level_execute_command(): starting 8586 1726776625.98613: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8586 1726776626.01344: stdout chunk (state=2): >>>/root <<< 8586 1726776626.01463: stderr chunk (state=3): >>><<< 8586 1726776626.01472: stdout chunk (state=3): >>><<< 8586 1726776626.01491: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8586 1726776626.01505: _low_level_execute_command(): starting 8586 1726776626.01513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620 `" && echo ansible-tmp-1726776626.0149927-8586-35383575164620="` echo /root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620 `" ) && sleep 0' 8586 1726776626.04234: stdout chunk (state=2): >>>ansible-tmp-1726776626.0149927-8586-35383575164620=/root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620 <<< 8586 1726776626.04248: stderr chunk (state=2): >>><<< 8586 1726776626.04260: stdout chunk (state=3): >>><<< 8586 1726776626.04276: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776626.0149927-8586-35383575164620=/root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620 , stderr= 8586 1726776626.04313: variable 'ansible_module_compression' from source: unknown 8586 1726776626.04351: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 8586 1726776626.04391: variable 'ansible_facts' from source: unknown 8586 1726776626.04491: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620/AnsiballZ_kernel_settings_get_config.py 8586 1726776626.04668: Sending initial data 8586 1726776626.04677: Sent initial data (172 bytes) 8586 1726776626.07751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpltjy10xs 
/root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620/AnsiballZ_kernel_settings_get_config.py <<< 8586 1726776626.08850: stderr chunk (state=3): >>><<< 8586 1726776626.08859: stdout chunk (state=3): >>><<< 8586 1726776626.08879: done transferring module to remote 8586 1726776626.08890: _low_level_execute_command(): starting 8586 1726776626.08896: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620/ /root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8586 1726776626.11184: stderr chunk (state=2): >>><<< 8586 1726776626.11191: stdout chunk (state=2): >>><<< 8586 1726776626.11205: _low_level_execute_command() done: rc=0, stdout=, stderr= 8586 1726776626.11209: _low_level_execute_command(): starting 8586 1726776626.11214: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8586 1726776626.26546: stdout chunk (state=2): >>> {"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 8586 1726776626.27601: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8586 1726776626.27649: stderr chunk (state=3): >>><<< 8586 1726776626.27656: stdout chunk (state=3): >>><<< 8586 1726776626.27675: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.14.221 closed. 
8586 1726776626.27697: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8586 1726776626.27708: _low_level_execute_command(): starting 8586 1726776626.27714: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776626.0149927-8586-35383575164620/ > /dev/null 2>&1 && sleep 0' 8586 1726776626.30117: stderr chunk (state=2): >>><<< 8586 1726776626.30125: stdout chunk (state=2): >>><<< 8586 1726776626.30142: _low_level_execute_command() done: rc=0, stdout=, stderr= 8586 1726776626.30150: handler run complete 8586 1726776626.30163: attempt loop complete, returning result 8586 1726776626.30166: _execute() done 8586 1726776626.30170: dumping result to json 8586 1726776626.30174: done dumping result, returning 8586 1726776626.30181: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get current config [120fa90a-8a95-f1be-6eb1-00000000014e] 8586 1726776626.30188: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014e 8586 1726776626.30214: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014e 8586 1726776626.30218: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "data": {} } 8186 1726776626.30347: no more pending results, returning 
what we have 8186 1726776626.30350: results queue empty 8186 1726776626.30351: checking for any_errors_fatal 8186 1726776626.30357: done checking for any_errors_fatal 8186 1726776626.30357: checking for max_fail_percentage 8186 1726776626.30358: done checking for max_fail_percentage 8186 1726776626.30359: checking to see if all hosts have failed and the running result is not ok 8186 1726776626.30360: done checking to see if all hosts have failed 8186 1726776626.30360: getting the remaining hosts for this loop 8186 1726776626.30361: done getting the remaining hosts for this loop 8186 1726776626.30364: getting the next task for host managed_node1 8186 1726776626.30370: done getting next task for host managed_node1 8186 1726776626.30375: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8186 1726776626.30378: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776626.30388: getting variables 8186 1726776626.30389: in VariableManager get_vars() 8186 1726776626.30420: Calling all_inventory to load vars for managed_node1 8186 1726776626.30423: Calling groups_inventory to load vars for managed_node1 8186 1726776626.30425: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776626.30435: Calling all_plugins_play to load vars for managed_node1 8186 1726776626.30438: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776626.30440: Calling groups_plugins_play to load vars for managed_node1 8186 1726776626.30588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776626.30707: done with get_vars() 8186 1726776626.30714: done getting variables 8186 1726776626.30798: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 16:10:26 -0400 (0:00:00.338) 0:00:14.180 **** 8186 1726776626.30821: entering _queue_task() for managed_node1/template 8186 1726776626.30822: Creating lock for template 8186 1726776626.30999: worker is 1 (out of 1 available) 8186 1726776626.31013: exiting _queue_task() for managed_node1/template 8186 1726776626.31025: done queuing things up, now waiting for results queue to drain 8186 1726776626.31027: waiting for pending results... 
8605 1726776626.31144: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8605 1726776626.31252: in run() - task 120fa90a-8a95-f1be-6eb1-00000000014f 8605 1726776626.31268: variable 'ansible_search_path' from source: unknown 8605 1726776626.31272: variable 'ansible_search_path' from source: unknown 8605 1726776626.31298: calling self._execute() 8605 1726776626.31358: variable 'ansible_host' from source: host vars for 'managed_node1' 8605 1726776626.31368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8605 1726776626.31375: variable 'omit' from source: magic vars 8605 1726776626.31444: variable 'omit' from source: magic vars 8605 1726776626.31481: variable 'omit' from source: magic vars 8605 1726776626.31704: variable '__kernel_settings_profile_src' from source: role '' all vars 8605 1726776626.31712: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8605 1726776626.31767: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8605 1726776626.31786: variable '__kernel_settings_profile_filename' from source: role '' all vars 8605 1726776626.31834: variable '__kernel_settings_profile_filename' from source: role '' all vars 8605 1726776626.31882: variable '__kernel_settings_profile_dir' from source: role '' all vars 8605 1726776626.31943: variable '__kernel_settings_profile_parent' from source: set_fact 8605 1726776626.31951: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8605 1726776626.31973: variable 'omit' from source: magic vars 8605 1726776626.32004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8605 1726776626.32034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8605 1726776626.32052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8605 1726776626.32066: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8605 1726776626.32077: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8605 1726776626.32100: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8605 1726776626.32105: variable 'ansible_host' from source: host vars for 'managed_node1' 8605 1726776626.32109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8605 1726776626.32184: Set connection var ansible_shell_executable to /bin/sh 8605 1726776626.32191: Set connection var ansible_timeout to 10 8605 1726776626.32197: Set connection var ansible_module_compression to ZIP_DEFLATED 8605 1726776626.32201: Set connection var ansible_connection to ssh 8605 1726776626.32207: Set connection var ansible_pipelining to False 8605 1726776626.32212: Set connection var ansible_shell_type to sh 8605 1726776626.32227: variable 'ansible_shell_executable' from source: unknown 8605 1726776626.32233: variable 'ansible_connection' from source: unknown 8605 1726776626.32237: variable 'ansible_module_compression' from source: unknown 8605 1726776626.32241: variable 'ansible_shell_type' from source: unknown 8605 1726776626.32244: variable 'ansible_shell_executable' from source: unknown 8605 1726776626.32248: variable 'ansible_host' from source: host vars for 'managed_node1' 8605 1726776626.32252: variable 'ansible_pipelining' from source: unknown 8605 1726776626.32256: variable 'ansible_timeout' from source: unknown 8605 1726776626.32259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8605 1726776626.32344: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8605 1726776626.32353: variable 'omit' from source: magic vars 8605 1726776626.32357: starting attempt loop 8605 1726776626.32359: running the handler 8605 1726776626.32368: _low_level_execute_command(): starting 8605 1726776626.32375: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8605 1726776626.34618: stdout chunk (state=2): >>>/root <<< 8605 1726776626.34735: stderr chunk (state=3): >>><<< 8605 1726776626.34741: stdout chunk (state=3): >>><<< 8605 1726776626.34760: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8605 1726776626.34774: _low_level_execute_command(): starting 8605 1726776626.34780: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964 `" && echo ansible-tmp-1726776626.347685-8605-109828471636964="` echo /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964 `" ) && sleep 0' 8605 1726776626.37434: stdout chunk (state=2): >>>ansible-tmp-1726776626.347685-8605-109828471636964=/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964 <<< 8605 1726776626.37445: stderr chunk (state=2): >>><<< 8605 1726776626.37457: stdout chunk (state=3): >>><<< 8605 1726776626.37469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776626.347685-8605-109828471636964=/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964 , stderr= 8605 1726776626.37487: evaluation_path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 8605 1726776626.37510: search_path: 
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 8605 1726776626.37539: variable 'ansible_search_path' from source: unknown 8605 1726776626.38487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8605 1726776626.40531: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8605 1726776626.40598: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8605 1726776626.40636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8605 1726776626.40670: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8605 1726776626.40696: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8605 1726776626.40984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8605 1726776626.41011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 8605 1726776626.41041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8605 1726776626.41080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8605 1726776626.41094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8605 1726776626.41455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8605 1726776626.41479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8605 1726776626.41501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8605 1726776626.41542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8605 1726776626.41556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8605 1726776626.41937: variable 'ansible_managed' from source: unknown 8605 1726776626.41947: variable '__sections' from source: task vars 8605 1726776626.42070: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8605 1726776626.42094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8605 1726776626.42118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8605 1726776626.42158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8605 1726776626.42175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8605 1726776626.42274: variable 'kernel_settings_sysctl' from source: include params 8605 1726776626.42281: variable '__kernel_settings_state_empty' from source: role '' all vars 8605 1726776626.42287: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8605 1726776626.42320: variable '__sysctl_old' from source: task vars 8605 1726776626.42381: variable '__sysctl_old' from source: task vars 8605 1726776626.42592: variable 'kernel_settings_purge' from source: include params 8605 1726776626.42599: variable 'kernel_settings_sysctl' from source: include params 8605 1726776626.42604: variable '__kernel_settings_state_empty' from source: role '' all vars 8605 1726776626.42608: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8605 1726776626.42612: variable '__kernel_settings_profile_contents' from source: set_fact 8605 1726776626.42792: variable 
'kernel_settings_sysfs' from source: include params 8605 1726776626.42798: variable '__kernel_settings_state_empty' from source: role '' all vars 8605 1726776626.42803: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8605 1726776626.42818: variable '__sysfs_old' from source: task vars 8605 1726776626.42876: variable '__sysfs_old' from source: task vars 8605 1726776626.43095: variable 'kernel_settings_purge' from source: include params 8605 1726776626.43102: variable 'kernel_settings_sysfs' from source: include params 8605 1726776626.43106: variable '__kernel_settings_state_empty' from source: role '' all vars 8605 1726776626.43111: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8605 1726776626.43116: variable '__kernel_settings_profile_contents' from source: set_fact 8605 1726776626.43134: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 8605 1726776626.43142: variable '__systemd_old' from source: task vars 8605 1726776626.43197: variable '__systemd_old' from source: task vars 8605 1726776626.43396: variable 'kernel_settings_purge' from source: include params 8605 1726776626.43404: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 8605 1726776626.43408: variable '__kernel_settings_state_absent' from source: role '' all vars 8605 1726776626.43413: variable '__kernel_settings_profile_contents' from source: set_fact 8605 1726776626.43423: variable 'kernel_settings_transparent_hugepages' from source: include params 8605 1726776626.43428: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 8605 1726776626.43434: variable '__trans_huge_old' from source: task vars 8605 1726776626.43488: variable '__trans_huge_old' from source: task vars 8605 1726776626.43686: variable 'kernel_settings_purge' from source: include params 8605 1726776626.43693: variable 'kernel_settings_transparent_hugepages' from source: include params 8605 
1726776626.43698: variable '__kernel_settings_state_absent' from source: role '' all vars 8605 1726776626.43703: variable '__kernel_settings_profile_contents' from source: set_fact 8605 1726776626.43712: variable '__trans_defrag_old' from source: task vars 8605 1726776626.43770: variable '__trans_defrag_old' from source: task vars 8605 1726776626.43967: variable 'kernel_settings_purge' from source: include params 8605 1726776626.43974: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 8605 1726776626.43979: variable '__kernel_settings_state_absent' from source: role '' all vars 8605 1726776626.43984: variable '__kernel_settings_profile_contents' from source: set_fact 8605 1726776626.44003: variable '__kernel_settings_state_absent' from source: role '' all vars 8605 1726776626.44016: variable '__kernel_settings_state_absent' from source: role '' all vars 8605 1726776626.44023: variable '__kernel_settings_state_absent' from source: role '' all vars 8605 1726776626.45316: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8605 1726776626.45369: variable 'ansible_module_compression' from source: unknown 8605 1726776626.45421: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8605 1726776626.45446: variable 'ansible_facts' from source: unknown 8605 1726776626.45539: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/AnsiballZ_stat.py 8605 1726776626.45990: Sending initial data 8605 1726776626.45998: Sent initial data (150 bytes) 8605 1726776626.49223: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpjfs7ef8j 
/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/AnsiballZ_stat.py <<< 8605 1726776626.50835: stderr chunk (state=3): >>><<< 8605 1726776626.50844: stdout chunk (state=3): >>><<< 8605 1726776626.50865: done transferring module to remote 8605 1726776626.50880: _low_level_execute_command(): starting 8605 1726776626.50886: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/ /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/AnsiballZ_stat.py && sleep 0' 8605 1726776626.54034: stderr chunk (state=2): >>><<< 8605 1726776626.54042: stdout chunk (state=2): >>><<< 8605 1726776626.54056: _low_level_execute_command() done: rc=0, stdout=, stderr= 8605 1726776626.54060: _low_level_execute_command(): starting 8605 1726776626.54066: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/AnsiballZ_stat.py && sleep 0' 8605 1726776626.69027: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8605 1726776626.70024: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8605 1726776626.70068: stderr chunk (state=3): >>><<< 8605 1726776626.70075: stdout chunk (state=3): >>><<< 8605 1726776626.70090: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.14.221 closed. 
8605 1726776626.70112: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8605 1726776626.70196: Sending initial data 8605 1726776626.70204: Sent initial data (158 bytes) 8605 1726776626.72712: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmptajamd_i/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/source <<< 8605 1726776626.73027: stderr chunk (state=3): >>><<< 8605 1726776626.73035: stdout chunk (state=3): >>><<< 8605 1726776626.73050: _low_level_execute_command(): starting 8605 1726776626.73057: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/ /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/source && sleep 0' 8605 1726776626.75362: stderr chunk (state=2): >>><<< 8605 1726776626.75376: stdout chunk (state=2): >>><<< 8605 1726776626.75391: _low_level_execute_command() done: rc=0, stdout=, stderr= 8605 1726776626.75410: variable 'ansible_module_compression' from source: unknown 8605 1726776626.75447: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8605 1726776626.75468: variable 
'ansible_facts' from source: unknown 8605 1726776626.75532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/AnsiballZ_copy.py 8605 1726776626.75620: Sending initial data 8605 1726776626.75627: Sent initial data (150 bytes) 8605 1726776626.78059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpeva9h17_ /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/AnsiballZ_copy.py <<< 8605 1726776626.79115: stderr chunk (state=3): >>><<< 8605 1726776626.79121: stdout chunk (state=3): >>><<< 8605 1726776626.79141: done transferring module to remote 8605 1726776626.79150: _low_level_execute_command(): starting 8605 1726776626.79155: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/ /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/AnsiballZ_copy.py && sleep 0' 8605 1726776626.81443: stderr chunk (state=2): >>><<< 8605 1726776626.81450: stdout chunk (state=2): >>><<< 8605 1726776626.81462: _low_level_execute_command() done: rc=0, stdout=, stderr= 8605 1726776626.81466: _low_level_execute_command(): starting 8605 1726776626.81471: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/AnsiballZ_copy.py && sleep 0' 8605 1726776626.97541: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/source", 
"dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8605 1726776626.98773: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8605 1726776626.98783: stdout chunk (state=3): >>><<< 8605 1726776626.98793: stderr chunk (state=3): >>><<< 8605 1726776626.98805: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
8605 1726776626.98831: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8605 1726776626.98863: _low_level_execute_command(): starting 8605 1726776626.98872: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/ > /dev/null 2>&1 && sleep 0' 8605 1726776627.01535: stderr chunk (state=2): >>><<< 8605 1726776627.01544: stdout chunk (state=2): >>><<< 8605 1726776627.01559: _low_level_execute_command() done: rc=0, stdout=, stderr= 8605 1726776627.01571: handler run complete 8605 1726776627.01603: attempt loop complete, returning result 8605 1726776627.01608: _execute() done 8605 1726776627.01611: dumping result to json 8605 1726776627.01616: done dumping result, returning 8605 1726776627.01623: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [120fa90a-8a95-f1be-6eb1-00000000014f] 8605 1726776627.01631: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014f 8605 1726776627.01693: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000014f 
8605 1726776627.01697: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "7d83891795eeb6debeff7e2812501630", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "src": "/root/.ansible/tmp/ansible-tmp-1726776626.347685-8605-109828471636964/source", "state": "file", "uid": 0 } 8186 1726776627.02214: no more pending results, returning what we have 8186 1726776627.02217: results queue empty 8186 1726776627.02218: checking for any_errors_fatal 8186 1726776627.02224: done checking for any_errors_fatal 8186 1726776627.02225: checking for max_fail_percentage 8186 1726776627.02226: done checking for max_fail_percentage 8186 1726776627.02227: checking to see if all hosts have failed and the running result is not ok 8186 1726776627.02228: done checking to see if all hosts have failed 8186 1726776627.02230: getting the remaining hosts for this loop 8186 1726776627.02231: done getting the remaining hosts for this loop 8186 1726776627.02235: getting the next task for host managed_node1 8186 1726776627.02241: done getting next task for host managed_node1 8186 1726776627.02245: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8186 1726776627.02248: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776627.02257: getting variables 8186 1726776627.02258: in VariableManager get_vars() 8186 1726776627.02289: Calling all_inventory to load vars for managed_node1 8186 1726776627.02291: Calling groups_inventory to load vars for managed_node1 8186 1726776627.02293: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776627.02302: Calling all_plugins_play to load vars for managed_node1 8186 1726776627.02304: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776627.02307: Calling groups_plugins_play to load vars for managed_node1 8186 1726776627.02467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776627.02704: done with get_vars() 8186 1726776627.02714: done getting variables 8186 1726776627.02772: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 16:10:27 -0400 (0:00:00.719) 0:00:14.899 **** 8186 1726776627.02802: entering _queue_task() for managed_node1/service 8186 1726776627.03012: worker is 1 (out of 1 available) 8186 
1726776627.03030: exiting _queue_task() for managed_node1/service 8186 1726776627.03045: done queuing things up, now waiting for results queue to drain 8186 1726776627.03047: waiting for pending results... 8625 1726776627.03278: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8625 1726776627.03424: in run() - task 120fa90a-8a95-f1be-6eb1-000000000150 8625 1726776627.03444: variable 'ansible_search_path' from source: unknown 8625 1726776627.03449: variable 'ansible_search_path' from source: unknown 8625 1726776627.03492: variable '__kernel_settings_services' from source: include_vars 8625 1726776627.03785: variable '__kernel_settings_services' from source: include_vars 8625 1726776627.03845: variable 'omit' from source: magic vars 8625 1726776627.03954: variable 'ansible_host' from source: host vars for 'managed_node1' 8625 1726776627.03967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8625 1726776627.03976: variable 'omit' from source: magic vars 8625 1726776627.04275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8625 1726776627.04503: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8625 1726776627.04546: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8625 1726776627.04578: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8625 1726776627.04610: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8625 1726776627.04714: variable '__kernel_settings_register_profile' from source: set_fact 8625 1726776627.04726: variable '__kernel_settings_register_mode' from source: set_fact 8625 1726776627.04741: Evaluated conditional (__kernel_settings_register_profile is changed or 
__kernel_settings_register_mode is changed): True 8625 1726776627.04748: variable 'omit' from source: magic vars 8625 1726776627.04795: variable 'omit' from source: magic vars 8625 1726776627.04842: variable 'item' from source: unknown 8625 1726776627.04902: variable 'item' from source: unknown 8625 1726776627.04922: variable 'omit' from source: magic vars 8625 1726776627.04951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8625 1726776627.05002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8625 1726776627.05021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8625 1726776627.05041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8625 1726776627.05053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8625 1726776627.05082: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8625 1726776627.05088: variable 'ansible_host' from source: host vars for 'managed_node1' 8625 1726776627.05092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8625 1726776627.05185: Set connection var ansible_shell_executable to /bin/sh 8625 1726776627.05194: Set connection var ansible_timeout to 10 8625 1726776627.05200: Set connection var ansible_module_compression to ZIP_DEFLATED 8625 1726776627.05202: Set connection var ansible_connection to ssh 8625 1726776627.05209: Set connection var ansible_pipelining to False 8625 1726776627.05214: Set connection var ansible_shell_type to sh 8625 1726776627.05283: variable 'ansible_shell_executable' from source: unknown 8625 1726776627.05289: variable 'ansible_connection' from source: unknown 8625 1726776627.05292: variable 'ansible_module_compression' from source: unknown 8625 
1726776627.05295: variable 'ansible_shell_type' from source: unknown 8625 1726776627.05298: variable 'ansible_shell_executable' from source: unknown 8625 1726776627.05301: variable 'ansible_host' from source: host vars for 'managed_node1' 8625 1726776627.05304: variable 'ansible_pipelining' from source: unknown 8625 1726776627.05307: variable 'ansible_timeout' from source: unknown 8625 1726776627.05310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8625 1726776627.05398: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8625 1726776627.05409: variable 'omit' from source: magic vars 8625 1726776627.05415: starting attempt loop 8625 1726776627.05418: running the handler 8625 1726776627.05494: variable 'ansible_facts' from source: unknown 8625 1726776627.05604: _low_level_execute_command(): starting 8625 1726776627.05618: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8625 1726776627.08018: stdout chunk (state=2): >>>/root <<< 8625 1726776627.08152: stderr chunk (state=3): >>><<< 8625 1726776627.08159: stdout chunk (state=3): >>><<< 8625 1726776627.08176: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8625 1726776627.08188: _low_level_execute_command(): starting 8625 1726776627.08194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289 `" && echo ansible-tmp-1726776627.0818381-8625-34669981874289="` echo /root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289 `" ) && sleep 0' 8625 1726776627.10834: stdout chunk (state=2): 
>>>ansible-tmp-1726776627.0818381-8625-34669981874289=/root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289 <<< 8625 1726776627.10843: stderr chunk (state=2): >>><<< 8625 1726776627.10853: stdout chunk (state=3): >>><<< 8625 1726776627.10866: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776627.0818381-8625-34669981874289=/root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289 , stderr= 8625 1726776627.10897: variable 'ansible_module_compression' from source: unknown 8625 1726776627.10951: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8625 1726776627.11014: variable 'ansible_facts' from source: unknown 8625 1726776627.11250: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289/AnsiballZ_systemd.py 8625 1726776627.11969: Sending initial data 8625 1726776627.11979: Sent initial data (153 bytes) 8625 1726776627.14630: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpmycdgo0d /root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289/AnsiballZ_systemd.py <<< 8625 1726776627.17442: stderr chunk (state=3): >>><<< 8625 1726776627.17451: stdout chunk (state=3): >>><<< 8625 1726776627.17475: done transferring module to remote 8625 1726776627.17487: _low_level_execute_command(): starting 8625 1726776627.17493: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289/ /root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289/AnsiballZ_systemd.py && sleep 0' 8625 1726776627.20634: stderr chunk (state=2): >>><<< 8625 1726776627.20643: stdout chunk (state=2): >>><<< 8625 1726776627.20656: _low_level_execute_command() done: rc=0, stdout=, stderr= 8625 1726776627.20659: _low_level_execute_command(): starting 8625 1726776627.20662: _low_level_execute_command(): executing: 
/bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289/AnsiballZ_systemd.py && sleep 0' 8625 1726776627.70926: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:07:20 EDT", "WatchdogTimestampMonotonic": "24404613", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "676", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ExecMainStartTimestampMonotonic": "23356599", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "676", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:07:19 EDT] ; stop_time=[n/a] ; pid=676 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18608128", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", 
"StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": <<< 8625 1726776627.70947: stdout chunk (state=3): >>>"infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22406", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", 
"LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service 
basic.target system.slice", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:07:20 EDT", "StateChangeTimestampMonotonic": "24404616", "InactiveExitTimestamp": "Thu 2024-09-19 16:07:19 EDT", "InactiveExitTimestampMonotonic": "23356643", "ActiveEnterTimestamp": "Thu 2024-09-19 16:07:20 EDT", "ActiveEnterTimestampMonotonic": "24404616", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ConditionTimestampMonotonic": "23355606", "AssertTimestamp": "Thu 2024-09-19 16:07:19 EDT", "AssertTimestampMonotonic": "23355607", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b85f8ab16fc34d90bf1e9620d92d7d18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8625 1726776627.72654: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. 
<<< 8625 1726776627.72696: stderr chunk (state=3): >>><<< 8625 1726776627.72704: stdout chunk (state=3): >>><<< 8625 1726776627.72723: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:07:20 EDT", "WatchdogTimestampMonotonic": "24404613", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "676", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ExecMainStartTimestampMonotonic": "23356599", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "676", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:07:19 EDT] ; stop_time=[n/a] ; pid=676 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18608128", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": 
"[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22406", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown 
cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service basic.target system.slice", "Documentation": "man:tuned(8) man:tuned.conf(5) 
man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:07:20 EDT", "StateChangeTimestampMonotonic": "24404616", "InactiveExitTimestamp": "Thu 2024-09-19 16:07:19 EDT", "InactiveExitTimestampMonotonic": "23356643", "ActiveEnterTimestamp": "Thu 2024-09-19 16:07:20 EDT", "ActiveEnterTimestampMonotonic": "24404616", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ConditionTimestampMonotonic": "23355606", "AssertTimestamp": "Thu 2024-09-19 16:07:19 EDT", "AssertTimestampMonotonic": "23355607", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b85f8ab16fc34d90bf1e9620d92d7d18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
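The result blob above is the systemd module's full return payload: every unit property systemd exposes, plus the module-level `changed`, `state`, and `enabled` keys. A consumer rarely needs all of it; a sketch that pulls out the usual handful of fields, run against a hand-trimmed copy of the values from this log:

```python
# A trimmed copy of the systemd module result printed above; the full "status"
# mapping carries every unit property, but callers typically need a few fields.
result = {
    "name": "tuned",
    "changed": True,
    "state": "started",
    "enabled": True,
    "status": {
        "ActiveState": "active",
        "SubState": "running",
        "MainPID": "676",
        "UnitFileState": "enabled",
        "Result": "success",
    },
}

def summarize(res: dict) -> str:
    """One-line summary of a systemd module result."""
    s = res["status"]
    return (f'{res["name"]}: {s["ActiveState"]}/{s["SubState"]} '
            f'(pid {s["MainPID"]}, unit {s["UnitFileState"]}, changed={res["changed"]})')

print(summarize(result))
```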
8625 1726776627.72822: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8625 1726776627.72842: _low_level_execute_command(): starting 8625 1726776627.72849: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776627.0818381-8625-34669981874289/ > /dev/null 2>&1 && sleep 0' 8625 1726776627.75333: stderr chunk (state=2): >>><<< 8625 1726776627.75341: stdout chunk (state=2): >>><<< 8625 1726776627.75355: _low_level_execute_command() done: rc=0, stdout=, stderr= 8625 1726776627.75362: handler run complete 8625 1726776627.75397: attempt loop complete, returning result 8625 1726776627.75414: variable 'item' from source: unknown 8625 1726776627.75478: variable 'item' from source: unknown changed: [managed_node1] => (item=tuned) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:07:20 EDT", "ActiveEnterTimestampMonotonic": "24404616", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service basic.target system.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", 
"AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:07:19 EDT", "AssertTimestampMonotonic": "23355607", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ConditionTimestampMonotonic": "23355606", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "676", "ExecMainStartTimestamp": "Thu 2024-09-19 16:07:19 EDT", "ExecMainStartTimestampMonotonic": "23356599", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:07:19 EDT] ; stop_time=[n/a] ; pid=676 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 16:07:19 EDT", "InactiveExitTimestampMonotonic": "23356643", "InvocationID": "b85f8ab16fc34d90bf1e9620d92d7d18", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", 
"LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "676", "MemoryAccounting": "yes", "MemoryCurrent": "18608128", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not 
set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:07:20 EDT", "StateChangeTimestampMonotonic": "24404616", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:07:20 EDT", "WatchdogTimestampMonotonic": "24404613", "WatchdogUSec": "0" } } 8625 1726776627.75585: dumping result to json 8625 1726776627.75603: done dumping result, returning 8625 1726776627.75612: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [120fa90a-8a95-f1be-6eb1-000000000150] 8625 1726776627.75618: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000150 8625 1726776627.75722: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000150 8625 1726776627.75727: WORKER PROCESS EXITING 8186 1726776627.76031: no more pending results, returning what we have 8186 1726776627.76034: results queue empty 8186 1726776627.76035: checking for any_errors_fatal 8186 1726776627.76042: done checking for any_errors_fatal 8186 1726776627.76042: checking for max_fail_percentage 8186 1726776627.76043: done checking for max_fail_percentage 8186 1726776627.76044: checking to see if all hosts have failed and the running result is not ok 8186 1726776627.76044: done checking to see if all hosts have failed 8186 
1726776627.76045: getting the remaining hosts for this loop 8186 1726776627.76045: done getting the remaining hosts for this loop 8186 1726776627.76047: getting the next task for host managed_node1 8186 1726776627.76051: done getting next task for host managed_node1 8186 1726776627.76054: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8186 1726776627.76056: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776627.76064: getting variables 8186 1726776627.76065: in VariableManager get_vars() 8186 1726776627.76090: Calling all_inventory to load vars for managed_node1 8186 1726776627.76092: Calling groups_inventory to load vars for managed_node1 8186 1726776627.76093: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776627.76100: Calling all_plugins_play to load vars for managed_node1 8186 1726776627.76102: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776627.76104: Calling groups_plugins_play to load vars for managed_node1 8186 1726776627.76203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776627.76344: done with get_vars() 8186 1726776627.76352: done getting variables 8186 1726776627.76393: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 16:10:27 -0400 (0:00:00.736) 0:00:15.635 **** 8186 1726776627.76415: entering _queue_task() for managed_node1/command 8186 1726776627.76580: worker is 1 (out of 1 available) 8186 1726776627.76595: exiting _queue_task() for managed_node1/command 8186 1726776627.76607: done queuing things up, now waiting for results queue to drain 8186 1726776627.76608: waiting for pending results... 
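The task banner above carries two durations alongside the wall-clock timestamp: the previous task's elapsed time in parentheses (`0:00:00.736`) and the cumulative run time (`0:00:15.635`). A small parser for those tokens, assuming only the `H:MM:SS.ffffff` shape seen in this log rather than anything guaranteed by the timing callback:

```python
# Extract (previous_task_seconds, cumulative_seconds) from a task timing banner.
# The H:MM:SS.ffffff pattern is assumed from the log output above; the fractional
# part distinguishes duration tokens from the plain HH:MM:SS wall-clock time.
import re

def parse_durations(banner: str) -> tuple[float, float]:
    """Return (previous task seconds, cumulative seconds) from a timing banner."""
    times = re.findall(r"(\d+):(\d+):(\d+\.\d+)", banner)
    secs = [int(h) * 3600 + int(m) * 60 + float(s) for h, m, s in times]
    return secs[-2], secs[-1]

banner = "Thursday 19 September 2024 16:10:27 -0400 (0:00:00.736)   0:00:15.635 ****"
prev, total = parse_durations(banner)
```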
8655 1726776627.76730: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8655 1726776627.76834: in run() - task 120fa90a-8a95-f1be-6eb1-000000000151 8655 1726776627.76849: variable 'ansible_search_path' from source: unknown 8655 1726776627.76854: variable 'ansible_search_path' from source: unknown 8655 1726776627.76884: calling self._execute() 8655 1726776627.76945: variable 'ansible_host' from source: host vars for 'managed_node1' 8655 1726776627.76953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8655 1726776627.76962: variable 'omit' from source: magic vars 8655 1726776627.77287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8655 1726776627.77463: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8655 1726776627.77498: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8655 1726776627.77525: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8655 1726776627.77552: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8655 1726776627.77646: variable '__kernel_settings_register_profile' from source: set_fact 8655 1726776627.77664: Evaluated conditional (not __kernel_settings_register_profile is changed): False 8655 1726776627.77668: when evaluation is False, skipping this task 8655 1726776627.77672: _execute() done 8655 1726776627.77676: dumping result to json 8655 1726776627.77679: done dumping result, returning 8655 1726776627.77686: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [120fa90a-8a95-f1be-6eb1-000000000151] 8655 1726776627.77692: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000151 8655 1726776627.77715: done sending task 
result for task 120fa90a-8a95-f1be-6eb1-000000000151 8655 1726776627.77718: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __kernel_settings_register_profile is changed", "skip_reason": "Conditional result was False" } 8186 1726776627.77866: no more pending results, returning what we have 8186 1726776627.77869: results queue empty 8186 1726776627.77869: checking for any_errors_fatal 8186 1726776627.77884: done checking for any_errors_fatal 8186 1726776627.77885: checking for max_fail_percentage 8186 1726776627.77886: done checking for max_fail_percentage 8186 1726776627.77886: checking to see if all hosts have failed and the running result is not ok 8186 1726776627.77887: done checking to see if all hosts have failed 8186 1726776627.77887: getting the remaining hosts for this loop 8186 1726776627.77888: done getting the remaining hosts for this loop 8186 1726776627.77890: getting the next task for host managed_node1 8186 1726776627.77894: done getting next task for host managed_node1 8186 1726776627.77897: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8186 1726776627.77899: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
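The two conditionals in this stretch of the log mirror each other: `not __kernel_settings_register_profile is changed` evaluates to False, so "Tuned apply settings" is skipped, while `__kernel_settings_register_apply is changed` evaluates to True, so the verify tasks are included. A rough Python approximation of that branching — the real Jinja2 `changed` test performs more validation on the registered result than this:

```python
# Simplified model of Ansible's `result is changed` test driving the two
# conditionals seen in the log. Variable names mirror the registered vars;
# the evaluation is an approximation, not Jinja2's actual test plugin.
def is_changed(registered: dict) -> bool:
    """Approximate `registered is changed` on a registered task result."""
    return bool(registered.get("changed", False))

kernel_settings_register_profile = {"changed": True}   # profile task reported a change
kernel_settings_register_apply = {"changed": True}     # tuned restart reported a change

# when: not __kernel_settings_register_profile is changed  -> False -> task skipped
run_tuned_apply = not is_changed(kernel_settings_register_profile)
# when: __kernel_settings_register_apply is changed        -> True  -> verify included
run_verify = is_changed(kernel_settings_register_apply)
```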
True, did start at task? False 8186 1726776627.77908: getting variables 8186 1726776627.77909: in VariableManager get_vars() 8186 1726776627.77933: Calling all_inventory to load vars for managed_node1 8186 1726776627.77935: Calling groups_inventory to load vars for managed_node1 8186 1726776627.77937: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776627.77942: Calling all_plugins_play to load vars for managed_node1 8186 1726776627.77944: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776627.77946: Calling groups_plugins_play to load vars for managed_node1 8186 1726776627.78046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776627.78164: done with get_vars() 8186 1726776627.78171: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 16:10:27 -0400 (0:00:00.018) 0:00:15.654 **** 8186 1726776627.78235: entering _queue_task() for managed_node1/include_tasks 8186 1726776627.78388: worker is 1 (out of 1 available) 8186 1726776627.78403: exiting _queue_task() for managed_node1/include_tasks 8186 1726776627.78413: done queuing things up, now waiting for results queue to drain 8186 1726776627.78415: waiting for pending results... 
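The "Verify settings" task queued above (task path `.../roles/kernel_settings/tasks/main.yml:166`) is a conditional include: the worker below evaluates `__kernel_settings_register_apply is changed` before pulling in `verify_settings.yml`. A minimal sketch of what such an include looks like — the `when:` expression matches the conditional the log evaluates, but the exact YAML in the role is an assumption:

```yaml
# Hypothetical reconstruction; only the task name, the included file, and the
# conditional (__kernel_settings_register_apply is changed) appear in the log.
- name: Verify settings
  include_tasks: verify_settings.yml
  when: __kernel_settings_register_apply is changed
```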
8656 1726776627.78524: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8656 1726776627.78632: in run() - task 120fa90a-8a95-f1be-6eb1-000000000152 8656 1726776627.78647: variable 'ansible_search_path' from source: unknown 8656 1726776627.78652: variable 'ansible_search_path' from source: unknown 8656 1726776627.78677: calling self._execute() 8656 1726776627.78733: variable 'ansible_host' from source: host vars for 'managed_node1' 8656 1726776627.78742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8656 1726776627.78750: variable 'omit' from source: magic vars 8656 1726776627.79055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8656 1726776627.79221: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8656 1726776627.79299: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8656 1726776627.79325: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8656 1726776627.79354: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8656 1726776627.79425: variable '__kernel_settings_register_apply' from source: set_fact 8656 1726776627.79450: Evaluated conditional (__kernel_settings_register_apply is changed): True 8656 1726776627.79457: _execute() done 8656 1726776627.79462: dumping result to json 8656 1726776627.79465: done dumping result, returning 8656 1726776627.79471: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [120fa90a-8a95-f1be-6eb1-000000000152] 8656 1726776627.79478: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000152 8656 1726776627.79494: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000152 8656 1726776627.79496: WORKER 
PROCESS EXITING 8186 1726776627.79695: no more pending results, returning what we have 8186 1726776627.79698: in VariableManager get_vars() 8186 1726776627.79725: Calling all_inventory to load vars for managed_node1 8186 1726776627.79727: Calling groups_inventory to load vars for managed_node1 8186 1726776627.79729: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776627.79738: Calling all_plugins_play to load vars for managed_node1 8186 1726776627.79740: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776627.79742: Calling groups_plugins_play to load vars for managed_node1 8186 1726776627.79842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776627.79990: done with get_vars() 8186 1726776627.79994: variable 'ansible_search_path' from source: unknown 8186 1726776627.79995: variable 'ansible_search_path' from source: unknown 8186 1726776627.80017: we have included files to process 8186 1726776627.80017: generating all_blocks data 8186 1726776627.80020: done generating all_blocks data 8186 1726776627.80024: processing included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8186 1726776627.80025: loading included file: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8186 1726776627.80026: Loading data from /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node1 8186 1726776627.80328: done processing included file 8186 1726776627.80332: iterating over new_blocks loaded from include file 8186 1726776627.80333: in VariableManager get_vars() 8186 1726776627.80347: done with get_vars() 8186 1726776627.80348: filtering new 
block on tags 8186 1726776627.80363: done filtering new block on tags 8186 1726776627.80365: done iterating over new_blocks loaded from include file 8186 1726776627.80365: extending task lists for all hosts with included blocks 8186 1726776627.80909: done extending task lists 8186 1726776627.80910: done processing included files 8186 1726776627.80910: results queue empty 8186 1726776627.80910: checking for any_errors_fatal 8186 1726776627.80913: done checking for any_errors_fatal 8186 1726776627.80913: checking for max_fail_percentage 8186 1726776627.80914: done checking for max_fail_percentage 8186 1726776627.80915: checking to see if all hosts have failed and the running result is not ok 8186 1726776627.80915: done checking to see if all hosts have failed 8186 1726776627.80916: getting the remaining hosts for this loop 8186 1726776627.80916: done getting the remaining hosts for this loop 8186 1726776627.80918: getting the next task for host managed_node1 8186 1726776627.80921: done getting next task for host managed_node1 8186 1726776627.80922: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8186 1726776627.80925: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776627.80933: getting variables 8186 1726776627.80934: in VariableManager get_vars() 8186 1726776627.80942: Calling all_inventory to load vars for managed_node1 8186 1726776627.80944: Calling groups_inventory to load vars for managed_node1 8186 1726776627.80945: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776627.80948: Calling all_plugins_play to load vars for managed_node1 8186 1726776627.80949: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776627.80951: Calling groups_plugins_play to load vars for managed_node1 8186 1726776627.81049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776627.81159: done with get_vars() 8186 1726776627.81165: done getting variables 8186 1726776627.81191: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 16:10:27 -0400 (0:00:00.029) 0:00:15.683 **** 8186 1726776627.81214: entering _queue_task() for managed_node1/command 8186 1726776627.81374: worker is 1 (out of 1 available) 8186 1726776627.81389: exiting _queue_task() for managed_node1/command 8186 
1726776627.81401: done queuing things up, now waiting for results queue to drain 8186 1726776627.81403: waiting for pending results... 8657 1726776627.81510: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8657 1726776627.81624: in run() - task 120fa90a-8a95-f1be-6eb1-000000000231 8657 1726776627.81643: variable 'ansible_search_path' from source: unknown 8657 1726776627.81647: variable 'ansible_search_path' from source: unknown 8657 1726776627.81673: calling self._execute() 8657 1726776627.81727: variable 'ansible_host' from source: host vars for 'managed_node1' 8657 1726776627.81739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8657 1726776627.81747: variable 'omit' from source: magic vars 8657 1726776627.81817: variable 'omit' from source: magic vars 8657 1726776627.81863: variable 'omit' from source: magic vars 8657 1726776627.81886: variable 'omit' from source: magic vars 8657 1726776627.81918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8657 1726776627.81946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8657 1726776627.81966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8657 1726776627.81982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8657 1726776627.81992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8657 1726776627.82018: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8657 1726776627.82023: variable 'ansible_host' from source: host vars for 'managed_node1' 8657 1726776627.82028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8657 1726776627.82099: Set 
connection var ansible_shell_executable to /bin/sh 8657 1726776627.82106: Set connection var ansible_timeout to 10 8657 1726776627.82112: Set connection var ansible_module_compression to ZIP_DEFLATED 8657 1726776627.82116: Set connection var ansible_connection to ssh 8657 1726776627.82123: Set connection var ansible_pipelining to False 8657 1726776627.82128: Set connection var ansible_shell_type to sh 8657 1726776627.82144: variable 'ansible_shell_executable' from source: unknown 8657 1726776627.82147: variable 'ansible_connection' from source: unknown 8657 1726776627.82150: variable 'ansible_module_compression' from source: unknown 8657 1726776627.82152: variable 'ansible_shell_type' from source: unknown 8657 1726776627.82153: variable 'ansible_shell_executable' from source: unknown 8657 1726776627.82155: variable 'ansible_host' from source: host vars for 'managed_node1' 8657 1726776627.82157: variable 'ansible_pipelining' from source: unknown 8657 1726776627.82159: variable 'ansible_timeout' from source: unknown 8657 1726776627.82161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8657 1726776627.82248: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8657 1726776627.82257: variable 'omit' from source: magic vars 8657 1726776627.82261: starting attempt loop 8657 1726776627.82263: running the handler 8657 1726776627.82273: _low_level_execute_command(): starting 8657 1726776627.82280: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8657 1726776627.84634: stdout chunk (state=2): >>>/root <<< 8657 1726776627.84801: stderr chunk (state=3): >>><<< 8657 1726776627.84809: stdout chunk (state=3): >>><<< 8657 1726776627.84828: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr= 8657 1726776627.84845: _low_level_execute_command(): starting 8657 1726776627.84852: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140 `" && echo ansible-tmp-1726776627.848389-8657-162263244187140="` echo /root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140 `" ) && sleep 0' 8657 1726776627.87645: stdout chunk (state=2): >>>ansible-tmp-1726776627.848389-8657-162263244187140=/root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140 <<< 8657 1726776627.87713: stderr chunk (state=3): >>><<< 8657 1726776627.87721: stdout chunk (state=3): >>><<< 8657 1726776627.87743: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776627.848389-8657-162263244187140=/root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140 , stderr= 8657 1726776627.87774: variable 'ansible_module_compression' from source: unknown 8657 1726776627.87831: ANSIBALLZ: Using generic lock for ansible.legacy.command 8657 1726776627.87837: ANSIBALLZ: Acquiring lock 8657 1726776627.87840: ANSIBALLZ: Lock acquired: 140184657595568 8657 1726776627.87844: ANSIBALLZ: Creating module 8657 1726776628.00707: ANSIBALLZ: Writing module into payload 8657 1726776628.00788: ANSIBALLZ: Writing module 8657 1726776628.00807: ANSIBALLZ: Renaming module 8657 1726776628.00814: ANSIBALLZ: Done creating module 8657 1726776628.00831: variable 'ansible_facts' from source: unknown 8657 1726776628.00887: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140/AnsiballZ_command.py 8657 1726776628.00981: Sending initial data 8657 1726776628.00988: Sent initial data (153 bytes) 8657 1726776628.03535: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpzj65u0yz 
/root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140/AnsiballZ_command.py <<< 8657 1726776628.04580: stderr chunk (state=3): >>><<< 8657 1726776628.04589: stdout chunk (state=3): >>><<< 8657 1726776628.04607: done transferring module to remote 8657 1726776628.04617: _low_level_execute_command(): starting 8657 1726776628.04622: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140/ /root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140/AnsiballZ_command.py && sleep 0' 8657 1726776628.07037: stderr chunk (state=2): >>><<< 8657 1726776628.07045: stdout chunk (state=2): >>><<< 8657 1726776628.07058: _low_level_execute_command() done: rc=0, stdout=, stderr= 8657 1726776628.07065: _low_level_execute_command(): starting 8657 1726776628.07070: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140/AnsiballZ_command.py && sleep 0' 8657 1726776628.35841: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:10:28.220181", "end": "2024-09-19 16:10:28.343711", "delta": "0:00:00.123530", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8657 1726776628.36940: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. 
<<< 8657 1726776628.36951: stdout chunk (state=3): >>><<< 8657 1726776628.36963: stderr chunk (state=3): >>><<< 8657 1726776628.36979: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 16:10:28.220181", "end": "2024-09-19 16:10:28.343711", "delta": "0:00:00.123530", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.14.221 closed. 8657 1726776628.37020: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8657 1726776628.37033: _low_level_execute_command(): starting 8657 1726776628.37041: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776627.848389-8657-162263244187140/ > /dev/null 2>&1 && sleep 0' 8657 1726776628.42536: stderr chunk (state=2): >>><<< 8657 1726776628.42548: stdout chunk (state=2): >>><<< 8657 1726776628.42566: _low_level_execute_command() done: rc=0, 
stdout=, stderr= 8657 1726776628.42576: handler run complete 8657 1726776628.42601: Evaluated conditional (False): False 8657 1726776628.42612: attempt loop complete, returning result 8657 1726776628.42615: _execute() done 8657 1726776628.42618: dumping result to json 8657 1726776628.42623: done dumping result, returning 8657 1726776628.42632: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [120fa90a-8a95-f1be-6eb1-000000000231] 8657 1726776628.42638: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000231 8657 1726776628.42679: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000231 8657 1726776628.42683: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.123530", "end": "2024-09-19 16:10:28.343711", "rc": 0, "start": "2024-09-19 16:10:28.220181" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 
8186 1726776628.43072: no more pending results, returning what we have 8186 1726776628.43075: results queue empty 8186 1726776628.43076: checking for any_errors_fatal 8186 1726776628.43078: done checking for any_errors_fatal 8186 1726776628.43078: checking for max_fail_percentage 8186 1726776628.43080: done checking for max_fail_percentage 8186 1726776628.43080: checking to see if all hosts have failed and the running result is not ok 8186 1726776628.43081: done checking to see if all hosts have failed 8186 1726776628.43082: getting the remaining hosts for this loop 8186 1726776628.43083: done getting the remaining hosts for this loop 8186 1726776628.43086: getting the next task for host managed_node1 8186 1726776628.43093: done getting next task for host managed_node1 8186 1726776628.43098: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8186 1726776628.43102: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? True, did start at task? False 8186 1726776628.43113: getting variables 8186 1726776628.43115: in VariableManager get_vars() 8186 1726776628.43149: Calling all_inventory to load vars for managed_node1 8186 1726776628.43152: Calling groups_inventory to load vars for managed_node1 8186 1726776628.43154: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776628.43163: Calling all_plugins_play to load vars for managed_node1 8186 1726776628.43165: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776628.43168: Calling groups_plugins_play to load vars for managed_node1 8186 1726776628.43319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776628.43521: done with get_vars() 8186 1726776628.43534: done getting variables 8186 1726776628.43621: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.624) 0:00:16.308 **** 8186 1726776628.43655: entering _queue_task() for managed_node1/shell 8186 1726776628.43657: Creating lock for shell 8186 1726776628.43873: worker is 1 (out of 1 available) 8186 1726776628.43886: exiting _queue_task() for managed_node1/shell 8186 1726776628.43898: done queuing things up, now waiting for results queue to drain 8186 1726776628.43900: waiting for pending results... 
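The "Get last verify results from log" task queued here is a fallback that only fires when verification failed; the log shows it evaluating `__kernel_settings_register_verify_values is failed`. A hypothetical reconstruction — only the task name, the `shell` action type, and the failure condition are visible in the log; the actual pipeline in the role may differ:

```yaml
# Hypothetical sketch; the real shell pipeline is not shown in the log.
- name: Get last verify results from log
  shell: >-
    tail -n 200 /var/log/tuned/tuned.log | grep -i verif || true
  when: __kernel_settings_register_verify_values is failed
  changed_when: false
```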
8693 1726776628.44343: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8693 1726776628.44504: in run() - task 120fa90a-8a95-f1be-6eb1-000000000232 8693 1726776628.44522: variable 'ansible_search_path' from source: unknown 8693 1726776628.44527: variable 'ansible_search_path' from source: unknown 8693 1726776628.44561: calling self._execute() 8693 1726776628.44635: variable 'ansible_host' from source: host vars for 'managed_node1' 8693 1726776628.44644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8693 1726776628.44653: variable 'omit' from source: magic vars 8693 1726776628.45453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8693 1726776628.45672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8693 1726776628.45713: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8693 1726776628.45747: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8693 1726776628.45781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8693 1726776628.45881: variable '__kernel_settings_register_verify_values' from source: set_fact 8693 1726776628.45901: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 8693 1726776628.45907: when evaluation is False, skipping this task 8693 1726776628.45911: _execute() done 8693 1726776628.45914: dumping result to json 8693 1726776628.45917: done dumping result, returning 8693 1726776628.45923: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [120fa90a-8a95-f1be-6eb1-000000000232] 8693 1726776628.45931: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000232 8693 
1726776628.45958: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000232 8693 1726776628.45962: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8186 1726776628.46343: no more pending results, returning what we have 8186 1726776628.46346: results queue empty 8186 1726776628.46347: checking for any_errors_fatal 8186 1726776628.46353: done checking for any_errors_fatal 8186 1726776628.46354: checking for max_fail_percentage 8186 1726776628.46355: done checking for max_fail_percentage 8186 1726776628.46356: checking to see if all hosts have failed and the running result is not ok 8186 1726776628.46357: done checking to see if all hosts have failed 8186 1726776628.46357: getting the remaining hosts for this loop 8186 1726776628.46358: done getting the remaining hosts for this loop 8186 1726776628.46361: getting the next task for host managed_node1 8186 1726776628.46367: done getting next task for host managed_node1 8186 1726776628.46370: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8186 1726776628.46376: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776628.46389: getting variables 8186 1726776628.46390: in VariableManager get_vars() 8186 1726776628.46420: Calling all_inventory to load vars for managed_node1 8186 1726776628.46423: Calling groups_inventory to load vars for managed_node1 8186 1726776628.46425: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776628.46434: Calling all_plugins_play to load vars for managed_node1 8186 1726776628.46436: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776628.46439: Calling groups_plugins_play to load vars for managed_node1 8186 1726776628.46833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776628.47032: done with get_vars() 8186 1726776628.47040: done getting variables 8186 1726776628.47093: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.034) 0:00:16.343 **** 8186 1726776628.47123: entering 
_queue_task() for managed_node1/fail 8186 1726776628.47312: worker is 1 (out of 1 available) 8186 1726776628.47323: exiting _queue_task() for managed_node1/fail 8186 1726776628.47342: done queuing things up, now waiting for results queue to drain 8186 1726776628.47345: waiting for pending results... 8694 1726776628.47584: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8694 1726776628.47742: in run() - task 120fa90a-8a95-f1be-6eb1-000000000233 8694 1726776628.47759: variable 'ansible_search_path' from source: unknown 8694 1726776628.47763: variable 'ansible_search_path' from source: unknown 8694 1726776628.47793: calling self._execute() 8694 1726776628.47867: variable 'ansible_host' from source: host vars for 'managed_node1' 8694 1726776628.47879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8694 1726776628.47888: variable 'omit' from source: magic vars 8694 1726776628.48318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8694 1726776628.48611: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8694 1726776628.48658: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8694 1726776628.48694: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8694 1726776628.48727: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8694 1726776628.48827: variable '__kernel_settings_register_verify_values' from source: set_fact 8694 1726776628.48849: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 8694 1726776628.48854: when evaluation is False, skipping this task 8694 1726776628.48857: _execute() done 8694 1726776628.48860: dumping result to json 8694 1726776628.48863: done 
dumping result, returning 8694 1726776628.48868: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [120fa90a-8a95-f1be-6eb1-000000000233] 8694 1726776628.48876: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000233 8694 1726776628.48902: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000233 8694 1726776628.48905: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8186 1726776628.49239: no more pending results, returning what we have 8186 1726776628.49242: results queue empty 8186 1726776628.49242: checking for any_errors_fatal 8186 1726776628.49247: done checking for any_errors_fatal 8186 1726776628.49248: checking for max_fail_percentage 8186 1726776628.49249: done checking for max_fail_percentage 8186 1726776628.49250: checking to see if all hosts have failed and the running result is not ok 8186 1726776628.49250: done checking to see if all hosts have failed 8186 1726776628.49251: getting the remaining hosts for this loop 8186 1726776628.49252: done getting the remaining hosts for this loop 8186 1726776628.49255: getting the next task for host managed_node1 8186 1726776628.49261: done getting next task for host managed_node1 8186 1726776628.49264: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8186 1726776628.49271: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776628.49286: getting variables 8186 1726776628.49288: in VariableManager get_vars() 8186 1726776628.49318: Calling all_inventory to load vars for managed_node1 8186 1726776628.49321: Calling groups_inventory to load vars for managed_node1 8186 1726776628.49323: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776628.49332: Calling all_plugins_play to load vars for managed_node1 8186 1726776628.49335: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776628.49338: Calling groups_plugins_play to load vars for managed_node1 8186 1726776628.49506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776628.49731: done with get_vars() 8186 1726776628.49742: done getting variables 8186 1726776628.49786: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.026) 0:00:16.369 **** 8186 1726776628.49810: entering _queue_task() for managed_node1/set_fact 8186 1726776628.49976: worker is 1 (out of 1 available) 8186 
1726776628.49989: exiting _queue_task() for managed_node1/set_fact 8186 1726776628.50000: done queuing things up, now waiting for results queue to drain 8186 1726776628.50002: waiting for pending results... 8697 1726776628.50133: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8697 1726776628.50263: in run() - task 120fa90a-8a95-f1be-6eb1-000000000153 8697 1726776628.50280: variable 'ansible_search_path' from source: unknown 8697 1726776628.50285: variable 'ansible_search_path' from source: unknown 8697 1726776628.50317: calling self._execute() 8697 1726776628.50396: variable 'ansible_host' from source: host vars for 'managed_node1' 8697 1726776628.50405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8697 1726776628.50414: variable 'omit' from source: magic vars 8697 1726776628.50507: variable 'omit' from source: magic vars 8697 1726776628.50557: variable 'omit' from source: magic vars 8697 1726776628.50586: variable 'omit' from source: magic vars 8697 1726776628.50621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8697 1726776628.50654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8697 1726776628.50674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8697 1726776628.50691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8697 1726776628.50703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8697 1726776628.50735: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8697 1726776628.50742: variable 'ansible_host' from source: host vars for 'managed_node1' 8697 1726776628.50746: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8697 1726776628.50827: Set connection var ansible_shell_executable to /bin/sh 8697 1726776628.50837: Set connection var ansible_timeout to 10 8697 1726776628.50843: Set connection var ansible_module_compression to ZIP_DEFLATED 8697 1726776628.50847: Set connection var ansible_connection to ssh 8697 1726776628.50854: Set connection var ansible_pipelining to False 8697 1726776628.50860: Set connection var ansible_shell_type to sh 8697 1726776628.50879: variable 'ansible_shell_executable' from source: unknown 8697 1726776628.50883: variable 'ansible_connection' from source: unknown 8697 1726776628.50887: variable 'ansible_module_compression' from source: unknown 8697 1726776628.50890: variable 'ansible_shell_type' from source: unknown 8697 1726776628.50893: variable 'ansible_shell_executable' from source: unknown 8697 1726776628.50896: variable 'ansible_host' from source: host vars for 'managed_node1' 8697 1726776628.50900: variable 'ansible_pipelining' from source: unknown 8697 1726776628.50903: variable 'ansible_timeout' from source: unknown 8697 1726776628.50907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8697 1726776628.50997: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8697 1726776628.51010: variable 'omit' from source: magic vars 8697 1726776628.51016: starting attempt loop 8697 1726776628.51020: running the handler 8697 1726776628.51039: handler run complete 8697 1726776628.51052: attempt loop complete, returning result 8697 1726776628.51058: _execute() done 8697 1726776628.51061: dumping result to json 8697 1726776628.51063: done dumping result, returning 8697 1726776628.51067: done running 
TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [120fa90a-8a95-f1be-6eb1-000000000153] 8697 1726776628.51071: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000153 ok: [managed_node1] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8697 1726776628.51087: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000153 8697 1726776628.51092: WORKER PROCESS EXITING 8186 1726776628.51316: no more pending results, returning what we have 8186 1726776628.51319: results queue empty 8186 1726776628.51319: checking for any_errors_fatal 8186 1726776628.51323: done checking for any_errors_fatal 8186 1726776628.51323: checking for max_fail_percentage 8186 1726776628.51324: done checking for max_fail_percentage 8186 1726776628.51325: checking to see if all hosts have failed and the running result is not ok 8186 1726776628.51325: done checking to see if all hosts have failed 8186 1726776628.51326: getting the remaining hosts for this loop 8186 1726776628.51326: done getting the remaining hosts for this loop 8186 1726776628.51330: getting the next task for host managed_node1 8186 1726776628.51335: done getting next task for host managed_node1 8186 1726776628.51337: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8186 1726776628.51340: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776628.51346: getting variables 8186 1726776628.51347: in VariableManager get_vars() 8186 1726776628.51370: Calling all_inventory to load vars for managed_node1 8186 1726776628.51372: Calling groups_inventory to load vars for managed_node1 8186 1726776628.51373: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776628.51380: Calling all_plugins_play to load vars for managed_node1 8186 1726776628.51382: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776628.51384: Calling groups_plugins_play to load vars for managed_node1 8186 1726776628.51515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776628.51659: done with get_vars() 8186 1726776628.51666: done getting variables 8186 1726776628.51713: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.019) 0:00:16.389 **** 8186 1726776628.51746: entering _queue_task() for managed_node1/set_fact 8186 1726776628.51921: worker is 1 (out of 1 available) 8186 
1726776628.51935: exiting _queue_task() for managed_node1/set_fact 8186 1726776628.51946: done queuing things up, now waiting for results queue to drain 8186 1726776628.51948: waiting for pending results... 8699 1726776628.52154: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8699 1726776628.52290: in run() - task 120fa90a-8a95-f1be-6eb1-000000000154 8699 1726776628.52307: variable 'ansible_search_path' from source: unknown 8699 1726776628.52311: variable 'ansible_search_path' from source: unknown 8699 1726776628.52342: calling self._execute() 8699 1726776628.52412: variable 'ansible_host' from source: host vars for 'managed_node1' 8699 1726776628.52424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8699 1726776628.52434: variable 'omit' from source: magic vars 8699 1726776628.52532: variable 'omit' from source: magic vars 8699 1726776628.52583: variable 'omit' from source: magic vars 8699 1726776628.52931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8699 1726776628.53181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8699 1726776628.53220: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8699 1726776628.53253: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8699 1726776628.53285: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8699 1726776628.53430: variable '__kernel_settings_register_profile' from source: set_fact 8699 1726776628.53444: variable '__kernel_settings_register_mode' from source: set_fact 8699 1726776628.53453: variable '__kernel_settings_register_apply' from source: set_fact 8699 1726776628.53498: variable 'omit' from source: magic vars 8699 1726776628.53524: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8699 1726776628.53579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8699 1726776628.53597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8699 1726776628.53612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8699 1726776628.53621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8699 1726776628.53648: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8699 1726776628.53653: variable 'ansible_host' from source: host vars for 'managed_node1' 8699 1726776628.53657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8699 1726776628.53753: Set connection var ansible_shell_executable to /bin/sh 8699 1726776628.53762: Set connection var ansible_timeout to 10 8699 1726776628.53768: Set connection var ansible_module_compression to ZIP_DEFLATED 8699 1726776628.53772: Set connection var ansible_connection to ssh 8699 1726776628.53783: Set connection var ansible_pipelining to False 8699 1726776628.53789: Set connection var ansible_shell_type to sh 8699 1726776628.53806: variable 'ansible_shell_executable' from source: unknown 8699 1726776628.53811: variable 'ansible_connection' from source: unknown 8699 1726776628.53814: variable 'ansible_module_compression' from source: unknown 8699 1726776628.53817: variable 'ansible_shell_type' from source: unknown 8699 1726776628.53820: variable 'ansible_shell_executable' from source: unknown 8699 1726776628.53823: variable 'ansible_host' from source: host vars for 'managed_node1' 8699 1726776628.53827: variable 'ansible_pipelining' from source: unknown 8699 1726776628.53832: variable 'ansible_timeout' from source: unknown 
8699 1726776628.53836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8699 1726776628.53923: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8699 1726776628.53934: variable 'omit' from source: magic vars 8699 1726776628.53938: starting attempt loop 8699 1726776628.53941: running the handler 8699 1726776628.53947: handler run complete 8699 1726776628.53953: attempt loop complete, returning result 8699 1726776628.53955: _execute() done 8699 1726776628.53957: dumping result to json 8699 1726776628.53959: done dumping result, returning 8699 1726776628.53963: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [120fa90a-8a95-f1be-6eb1-000000000154] 8699 1726776628.53966: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000154 8699 1726776628.53983: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000154 8699 1726776628.53985: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8186 1726776628.54498: no more pending results, returning what we have 8186 1726776628.54501: results queue empty 8186 1726776628.54501: checking for any_errors_fatal 8186 1726776628.54504: done checking for any_errors_fatal 8186 1726776628.54505: checking for max_fail_percentage 8186 1726776628.54506: done checking for max_fail_percentage 8186 1726776628.54506: checking to see if all hosts have failed and the running result is not ok 8186 1726776628.54507: done checking to see if all hosts have failed 8186 1726776628.54507: getting the remaining hosts for this loop 8186 1726776628.54508: done getting the remaining hosts for this loop 
8186 1726776628.54510: getting the next task for host managed_node1 8186 1726776628.54515: done getting next task for host managed_node1 8186 1726776628.54516: ^ task is: TASK: meta (role_complete) 8186 1726776628.54519: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776628.54525: getting variables 8186 1726776628.54526: in VariableManager get_vars() 8186 1726776628.54553: Calling all_inventory to load vars for managed_node1 8186 1726776628.54555: Calling groups_inventory to load vars for managed_node1 8186 1726776628.54556: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776628.54562: Calling all_plugins_play to load vars for managed_node1 8186 1726776628.54563: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776628.54565: Calling groups_plugins_play to load vars for managed_node1 8186 1726776628.54679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776628.54798: done with get_vars() 8186 1726776628.54806: done getting variables 8186 1726776628.54858: done queuing things up, now waiting for results queue to drain 8186 1726776628.54863: results queue empty 8186 1726776628.54863: checking for any_errors_fatal 8186 1726776628.54866: done checking for any_errors_fatal 8186 1726776628.54866: checking for max_fail_percentage 8186 1726776628.54867: done checking for max_fail_percentage 8186 1726776628.54867: checking to see if all hosts have failed and the running result is not ok 8186 1726776628.54868: done checking to see if all hosts have failed 8186 1726776628.54868: getting the remaining hosts for this loop 8186 1726776628.54868: done getting the remaining hosts for this loop 8186 1726776628.54870: getting the next task for host managed_node1 8186 1726776628.54873: done getting next task for host managed_node1 8186 1726776628.54875: ^ task is: TASK: Verify no settings 8186 1726776628.54876: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776628.54878: getting variables 8186 1726776628.54879: in VariableManager get_vars() 8186 1726776628.54886: Calling all_inventory to load vars for managed_node1 8186 1726776628.54887: Calling groups_inventory to load vars for managed_node1 8186 1726776628.54888: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776628.54892: Calling all_plugins_play to load vars for managed_node1 8186 1726776628.54894: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776628.54895: Calling groups_plugins_play to load vars for managed_node1 8186 1726776628.54997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776628.55104: done with get_vars() 8186 1726776628.55111: done getting variables 8186 1726776628.55136: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify no settings] ****************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.034) 0:00:16.423 **** 8186 1726776628.55155: entering _queue_task() for managed_node1/shell 8186 1726776628.55312: worker is 1 (out of 1 available) 8186 1726776628.55326: exiting _queue_task() for managed_node1/shell 8186 1726776628.55341: done queuing things up, now waiting for results 
queue to drain 8186 1726776628.55343: waiting for pending results... 8702 1726776628.55460: running TaskExecutor() for managed_node1/TASK: Verify no settings 8702 1726776628.55556: in run() - task 120fa90a-8a95-f1be-6eb1-000000000098 8702 1726776628.55572: variable 'ansible_search_path' from source: unknown 8702 1726776628.55578: variable 'ansible_search_path' from source: unknown 8702 1726776628.55604: calling self._execute() 8702 1726776628.55660: variable 'ansible_host' from source: host vars for 'managed_node1' 8702 1726776628.55667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8702 1726776628.55679: variable 'omit' from source: magic vars 8702 1726776628.55750: variable 'omit' from source: magic vars 8702 1726776628.55780: variable 'omit' from source: magic vars 8702 1726776628.56023: variable '__kernel_settings_profile_filename' from source: role '' exported vars 8702 1726776628.56080: variable '__kernel_settings_profile_dir' from source: role '' exported vars 8702 1726776628.56142: variable '__kernel_settings_profile_parent' from source: set_fact 8702 1726776628.56151: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 8702 1726776628.56183: variable 'omit' from source: magic vars 8702 1726776628.56214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8702 1726776628.56244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8702 1726776628.56262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8702 1726776628.56277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8702 1726776628.56288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8702 1726776628.56309: variable 'inventory_hostname' 
from source: host vars for 'managed_node1' 8702 1726776628.56313: variable 'ansible_host' from source: host vars for 'managed_node1' 8702 1726776628.56315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8702 1726776628.56470: Set connection var ansible_shell_executable to /bin/sh 8702 1726776628.56481: Set connection var ansible_timeout to 10 8702 1726776628.56487: Set connection var ansible_module_compression to ZIP_DEFLATED 8702 1726776628.56490: Set connection var ansible_connection to ssh 8702 1726776628.56497: Set connection var ansible_pipelining to False 8702 1726776628.56502: Set connection var ansible_shell_type to sh 8702 1726776628.56518: variable 'ansible_shell_executable' from source: unknown 8702 1726776628.56522: variable 'ansible_connection' from source: unknown 8702 1726776628.56524: variable 'ansible_module_compression' from source: unknown 8702 1726776628.56526: variable 'ansible_shell_type' from source: unknown 8702 1726776628.56527: variable 'ansible_shell_executable' from source: unknown 8702 1726776628.56532: variable 'ansible_host' from source: host vars for 'managed_node1' 8702 1726776628.56536: variable 'ansible_pipelining' from source: unknown 8702 1726776628.56539: variable 'ansible_timeout' from source: unknown 8702 1726776628.56543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8702 1726776628.56633: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8702 1726776628.56644: variable 'omit' from source: magic vars 8702 1726776628.56649: starting attempt loop 8702 1726776628.56653: running the handler 8702 1726776628.56661: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8702 1726776628.56677: _low_level_execute_command(): starting 8702 1726776628.56685: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8702 1726776628.59171: stdout chunk (state=2): >>>/root <<< 8702 1726776628.59286: stderr chunk (state=3): >>><<< 8702 1726776628.59293: stdout chunk (state=3): >>><<< 8702 1726776628.59309: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8702 1726776628.59321: _low_level_execute_command(): starting 8702 1726776628.59327: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150 `" && echo ansible-tmp-1726776628.5931666-8702-41166703853150="` echo /root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150 `" ) && sleep 0' 8702 1726776628.61769: stdout chunk (state=2): >>>ansible-tmp-1726776628.5931666-8702-41166703853150=/root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150 <<< 8702 1726776628.61894: stderr chunk (state=3): >>><<< 8702 1726776628.61901: stdout chunk (state=3): >>><<< 8702 1726776628.61915: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776628.5931666-8702-41166703853150=/root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150 , stderr= 8702 1726776628.61943: variable 'ansible_module_compression' from source: unknown 8702 1726776628.61983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8702 1726776628.62011: variable 'ansible_facts' from source: unknown 8702 1726776628.62085: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150/AnsiballZ_command.py 8702 1726776628.62186: 
Sending initial data 8702 1726776628.62194: Sent initial data (153 bytes) 8702 1726776628.64679: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpurvlzra4 /root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150/AnsiballZ_command.py <<< 8702 1726776628.65759: stderr chunk (state=3): >>><<< 8702 1726776628.65767: stdout chunk (state=3): >>><<< 8702 1726776628.65788: done transferring module to remote 8702 1726776628.65799: _low_level_execute_command(): starting 8702 1726776628.65805: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150/ /root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150/AnsiballZ_command.py && sleep 0' 8702 1726776628.68196: stderr chunk (state=2): >>><<< 8702 1726776628.68203: stdout chunk (state=2): >>><<< 8702 1726776628.68218: _low_level_execute_command() done: rc=0, stdout=, stderr= 8702 1726776628.68222: _low_level_execute_command(): starting 8702 1726776628.68227: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150/AnsiballZ_command.py && sleep 0' 8702 1726776628.83917: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; 
then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 16:10:28.830570", "end": "2024-09-19 16:10:28.838079", "delta": "0:00:00.007509", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8702 1726776628.85203: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8702 1726776628.85212: stdout chunk (state=3): >>><<< 8702 1726776628.85221: stderr chunk (state=3): >>><<< 8702 1726776628.85235: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 16:10:28.830570", "end": "2024-09-19 16:10:28.838079", "delta": "0:00:00.007509", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 
1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.14.221 closed. 8702 1726776628.85277: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\["$section"\\\\] "$conf"; then\n echo ERROR: "$section" settings present\n rc=1\n fi\ndone\nexit "$rc"\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8702 1726776628.85288: _low_level_execute_command(): starting 8702 1726776628.85295: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776628.5931666-8702-41166703853150/ > /dev/null 2>&1 && sleep 0' 8702 1726776628.87815: stderr chunk (state=2): >>><<< 8702 1726776628.87823: stdout chunk (state=2): >>><<< 8702 1726776628.87841: _low_level_execute_command() done: rc=0, stdout=, stderr= 8702 1726776628.87848: handler run complete 8702 1726776628.87870: Evaluated conditional (False): False 8702 
1726776628.87885: attempt loop complete, returning result 8702 1726776628.87890: _execute() done 8702 1726776628.87894: dumping result to json 8702 1726776628.87899: done dumping result, returning 8702 1726776628.87907: done running TaskExecutor() for managed_node1/TASK: Verify no settings [120fa90a-8a95-f1be-6eb1-000000000098] 8702 1726776628.87914: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000098 8702 1726776628.87955: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000098 8702 1726776628.87959: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "delta": "0:00:00.007509", "end": "2024-09-19 16:10:28.838079", "rc": 0, "start": "2024-09-19 16:10:28.830570" } STDERR: + exec + rc=0 + conf=/etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysctl\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysfs\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[systemd\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[vm\]' /etc/tuned/kernel_settings/tuned.conf + exit 0 8186 1726776628.88190: no more pending results, returning what we have 8186 1726776628.88193: results queue empty 8186 1726776628.88194: checking for any_errors_fatal 8186 1726776628.88196: done checking for any_errors_fatal 8186 1726776628.88196: checking for max_fail_percentage 8186 1726776628.88198: done checking for max_fail_percentage 8186 1726776628.88198: checking to see if all hosts have failed and the running result is not ok 8186 1726776628.88199: done checking to see if all hosts have failed 8186 1726776628.88199: getting the 
remaining hosts for this loop 8186 1726776628.88201: done getting the remaining hosts for this loop 8186 1726776628.88204: getting the next task for host managed_node1 8186 1726776628.88211: done getting next task for host managed_node1 8186 1726776628.88213: ^ task is: TASK: Remove kernel_settings tuned profile 8186 1726776628.88216: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8186 1726776628.88218: getting variables 8186 1726776628.88219: in VariableManager get_vars() 8186 1726776628.88256: Calling all_inventory to load vars for managed_node1 8186 1726776628.88260: Calling groups_inventory to load vars for managed_node1 8186 1726776628.88262: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776628.88272: Calling all_plugins_play to load vars for managed_node1 8186 1726776628.88278: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776628.88280: Calling groups_plugins_play to load vars for managed_node1 8186 1726776628.88432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776628.88616: done with get_vars() 8186 1726776628.88630: done getting variables TASK [Remove kernel_settings tuned profile] ************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36 Thursday 19 September 2024 16:10:28 -0400 (0:00:00.335) 0:00:16.758 **** 8186 
1726776628.88711: entering _queue_task() for managed_node1/file 8186 1726776628.88912: worker is 1 (out of 1 available) 8186 1726776628.88925: exiting _queue_task() for managed_node1/file 8186 1726776628.88937: done queuing things up, now waiting for results queue to drain 8186 1726776628.88940: waiting for pending results... 8724 1726776628.89238: running TaskExecutor() for managed_node1/TASK: Remove kernel_settings tuned profile 8724 1726776628.89358: in run() - task 120fa90a-8a95-f1be-6eb1-000000000099 8724 1726776628.89375: variable 'ansible_search_path' from source: unknown 8724 1726776628.89380: variable 'ansible_search_path' from source: unknown 8724 1726776628.89412: calling self._execute() 8724 1726776628.89472: variable 'ansible_host' from source: host vars for 'managed_node1' 8724 1726776628.89482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8724 1726776628.89490: variable 'omit' from source: magic vars 8724 1726776628.89574: variable 'omit' from source: magic vars 8724 1726776628.89611: variable 'omit' from source: magic vars 8724 1726776628.89639: variable '__kernel_settings_profile_dir' from source: role '' exported vars 8724 1726776628.89886: variable '__kernel_settings_profile_dir' from source: role '' exported vars 8724 1726776628.89952: variable '__kernel_settings_profile_parent' from source: set_fact 8724 1726776628.89962: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 8724 1726776628.89994: variable 'omit' from source: magic vars 8724 1726776628.90047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8724 1726776628.90147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8724 1726776628.90168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8724 1726776628.90185: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8724 1726776628.90197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8724 1726776628.90219: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8724 1726776628.90222: variable 'ansible_host' from source: host vars for 'managed_node1' 8724 1726776628.90225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8724 1726776628.90322: Set connection var ansible_shell_executable to /bin/sh 8724 1726776628.90334: Set connection var ansible_timeout to 10 8724 1726776628.90340: Set connection var ansible_module_compression to ZIP_DEFLATED 8724 1726776628.90344: Set connection var ansible_connection to ssh 8724 1726776628.90351: Set connection var ansible_pipelining to False 8724 1726776628.90356: Set connection var ansible_shell_type to sh 8724 1726776628.90376: variable 'ansible_shell_executable' from source: unknown 8724 1726776628.90381: variable 'ansible_connection' from source: unknown 8724 1726776628.90384: variable 'ansible_module_compression' from source: unknown 8724 1726776628.90387: variable 'ansible_shell_type' from source: unknown 8724 1726776628.90389: variable 'ansible_shell_executable' from source: unknown 8724 1726776628.90392: variable 'ansible_host' from source: host vars for 'managed_node1' 8724 1726776628.90395: variable 'ansible_pipelining' from source: unknown 8724 1726776628.90398: variable 'ansible_timeout' from source: unknown 8724 1726776628.90401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8724 1726776628.90570: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
8724 1726776628.90581: variable 'omit' from source: magic vars 8724 1726776628.90586: starting attempt loop 8724 1726776628.90589: running the handler 8724 1726776628.90600: _low_level_execute_command(): starting 8724 1726776628.90608: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8724 1726776628.93064: stdout chunk (state=2): >>>/root <<< 8724 1726776628.93197: stderr chunk (state=3): >>><<< 8724 1726776628.93205: stdout chunk (state=3): >>><<< 8724 1726776628.93231: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8724 1726776628.93245: _low_level_execute_command(): starting 8724 1726776628.93252: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398 `" && echo ansible-tmp-1726776628.9324036-8724-250479262352398="` echo /root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398 `" ) && sleep 0' 8724 1726776628.95654: stdout chunk (state=2): >>>ansible-tmp-1726776628.9324036-8724-250479262352398=/root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398 <<< 8724 1726776628.95782: stderr chunk (state=3): >>><<< 8724 1726776628.95791: stdout chunk (state=3): >>><<< 8724 1726776628.95803: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776628.9324036-8724-250479262352398=/root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398 , stderr= 8724 1726776628.95839: variable 'ansible_module_compression' from source: unknown 8724 1726776628.95881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 8724 1726776628.95913: variable 'ansible_facts' from source: unknown 8724 1726776628.95987: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398/AnsiballZ_file.py 8724 1726776628.96087: Sending initial data 8724 1726776628.96095: 
Sent initial data (151 bytes) 8724 1726776628.98497: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpmk4rb048 /root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398/AnsiballZ_file.py <<< 8724 1726776628.99556: stderr chunk (state=3): >>><<< 8724 1726776628.99563: stdout chunk (state=3): >>><<< 8724 1726776628.99582: done transferring module to remote 8724 1726776628.99591: _low_level_execute_command(): starting 8724 1726776628.99597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398/ /root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398/AnsiballZ_file.py && sleep 0' 8724 1726776629.01860: stderr chunk (state=2): >>><<< 8724 1726776629.01866: stdout chunk (state=2): >>><<< 8724 1726776629.01882: _low_level_execute_command() done: rc=0, stdout=, stderr= 8724 1726776629.01886: _low_level_execute_command(): starting 8724 1726776629.01891: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398/AnsiballZ_file.py && sleep 0' 8724 1726776629.17444: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, 
"setype": null, "attributes": null}}} <<< 8724 1726776629.18158: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8724 1726776629.18201: stderr chunk (state=3): >>><<< 8724 1726776629.18207: stdout chunk (state=3): >>><<< 8724 1726776629.18221: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
8724 1726776629.18252: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'absent', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8724 1726776629.18260: _low_level_execute_command(): starting 8724 1726776629.18264: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776628.9324036-8724-250479262352398/ > /dev/null 2>&1 && sleep 0' 8724 1726776629.20834: stderr chunk (state=2): >>><<< 8724 1726776629.20843: stdout chunk (state=2): >>><<< 8724 1726776629.20857: _low_level_execute_command() done: rc=0, stdout=, stderr= 8724 1726776629.20863: handler run complete 8724 1726776629.20892: attempt loop complete, returning result 8724 1726776629.20899: _execute() done 8724 1726776629.20903: dumping result to json 8724 1726776629.20908: done dumping result, returning 8724 1726776629.20915: done running TaskExecutor() for managed_node1/TASK: Remove kernel_settings tuned profile [120fa90a-8a95-f1be-6eb1-000000000099] 8724 1726776629.20921: sending task result for task 120fa90a-8a95-f1be-6eb1-000000000099 8724 1726776629.20960: done sending task result for task 120fa90a-8a95-f1be-6eb1-000000000099 8724 1726776629.20964: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "path": "/etc/tuned/kernel_settings", "state": "absent" } 8186 1726776629.21264: no more pending results, returning what we have 8186 1726776629.21266: results queue empty 8186 
1726776629.21267: checking for any_errors_fatal 8186 1726776629.21273: done checking for any_errors_fatal 8186 1726776629.21274: checking for max_fail_percentage 8186 1726776629.21275: done checking for max_fail_percentage 8186 1726776629.21276: checking to see if all hosts have failed and the running result is not ok 8186 1726776629.21276: done checking to see if all hosts have failed 8186 1726776629.21277: getting the remaining hosts for this loop 8186 1726776629.21278: done getting the remaining hosts for this loop 8186 1726776629.21280: getting the next task for host managed_node1 8186 1726776629.21284: done getting next task for host managed_node1 8186 1726776629.21285: ^ task is: TASK: Get active_profile 8186 1726776629.21288: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776629.21293: getting variables 8186 1726776629.21294: in VariableManager get_vars() 8186 1726776629.21323: Calling all_inventory to load vars for managed_node1 8186 1726776629.21326: Calling groups_inventory to load vars for managed_node1 8186 1726776629.21327: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776629.21337: Calling all_plugins_play to load vars for managed_node1 8186 1726776629.21344: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776629.21347: Calling groups_plugins_play to load vars for managed_node1 8186 1726776629.21563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776629.21699: done with get_vars() 8186 1726776629.21707: done getting variables TASK [Get active_profile] ****************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41 Thursday 19 September 2024 16:10:29 -0400 (0:00:00.330) 0:00:17.089 **** 8186 1726776629.21778: entering _queue_task() for managed_node1/slurp 8186 1726776629.21971: worker is 1 (out of 1 available) 8186 1726776629.21984: exiting _queue_task() for managed_node1/slurp 8186 1726776629.21996: done queuing things up, now waiting for results queue to drain 8186 1726776629.21998: waiting for pending results... 
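The "Remove kernel_settings tuned profile" result above shows the `file` module deleting /etc/tuned/kernel_settings with `state: absent`, directory contents included. In plain shell terms that is roughly the following, sketched here against a scratch directory rather than the real /etc/tuned path:

```shell
# Rough shell equivalent of the file module call with state=absent,
# using a scratch directory instead of /etc/tuned/kernel_settings.
profile_dir="$(mktemp -d)/kernel_settings"
mkdir -p "$profile_dir"
touch "$profile_dir/tuned.conf"

# state=absent removes the path and everything under it
rm -rf "$profile_dir"

test ! -e "$profile_dir" && echo "profile directory removed"
```

The module's diff output in the log records the same before/after transition: a directory containing tuned.conf, then `state: absent`.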
8754 1726776629.22203: running TaskExecutor() for managed_node1/TASK: Get active_profile 8754 1726776629.22321: in run() - task 120fa90a-8a95-f1be-6eb1-00000000009a 8754 1726776629.22343: variable 'ansible_search_path' from source: unknown 8754 1726776629.22350: variable 'ansible_search_path' from source: unknown 8754 1726776629.22375: calling self._execute() 8754 1726776629.22447: variable 'ansible_host' from source: host vars for 'managed_node1' 8754 1726776629.22453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8754 1726776629.22459: variable 'omit' from source: magic vars 8754 1726776629.22535: variable 'omit' from source: magic vars 8754 1726776629.22564: variable 'omit' from source: magic vars 8754 1726776629.22582: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 8754 1726776629.22807: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 8754 1726776629.22867: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 8754 1726776629.22894: variable 'omit' from source: magic vars 8754 1726776629.22926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8754 1726776629.22955: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8754 1726776629.22972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8754 1726776629.22985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8754 1726776629.22994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8754 1726776629.23017: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8754 1726776629.23021: variable 'ansible_host' from source: host vars for 'managed_node1' 8754 
1726776629.23024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8754 1726776629.23101: Set connection var ansible_shell_executable to /bin/sh 8754 1726776629.23109: Set connection var ansible_timeout to 10 8754 1726776629.23115: Set connection var ansible_module_compression to ZIP_DEFLATED 8754 1726776629.23118: Set connection var ansible_connection to ssh 8754 1726776629.23124: Set connection var ansible_pipelining to False 8754 1726776629.23127: Set connection var ansible_shell_type to sh 8754 1726776629.23142: variable 'ansible_shell_executable' from source: unknown 8754 1726776629.23144: variable 'ansible_connection' from source: unknown 8754 1726776629.23146: variable 'ansible_module_compression' from source: unknown 8754 1726776629.23148: variable 'ansible_shell_type' from source: unknown 8754 1726776629.23150: variable 'ansible_shell_executable' from source: unknown 8754 1726776629.23151: variable 'ansible_host' from source: host vars for 'managed_node1' 8754 1726776629.23153: variable 'ansible_pipelining' from source: unknown 8754 1726776629.23155: variable 'ansible_timeout' from source: unknown 8754 1726776629.23157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8754 1726776629.23297: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 8754 1726776629.23307: variable 'omit' from source: magic vars 8754 1726776629.23312: starting attempt loop 8754 1726776629.23316: running the handler 8754 1726776629.23326: _low_level_execute_command(): starting 8754 1726776629.23337: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8754 1726776629.25688: stdout chunk (state=2): >>>/root <<< 8754 1726776629.25808: stderr chunk (state=3): >>><<< 8754 1726776629.25814: stdout chunk 
(state=3): >>><<< 8754 1726776629.25830: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8754 1726776629.25840: _low_level_execute_command(): starting 8754 1726776629.25844: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636 `" && echo ansible-tmp-1726776629.2583573-8754-268101759512636="` echo /root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636 `" ) && sleep 0' 8754 1726776629.28222: stdout chunk (state=2): >>>ansible-tmp-1726776629.2583573-8754-268101759512636=/root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636 <<< 8754 1726776629.28354: stderr chunk (state=3): >>><<< 8754 1726776629.28360: stdout chunk (state=3): >>><<< 8754 1726776629.28374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776629.2583573-8754-268101759512636=/root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636 , stderr= 8754 1726776629.28413: variable 'ansible_module_compression' from source: unknown 8754 1726776629.28452: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 8754 1726776629.28483: variable 'ansible_facts' from source: unknown 8754 1726776629.28552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636/AnsiballZ_slurp.py 8754 1726776629.28652: Sending initial data 8754 1726776629.28659: Sent initial data (152 bytes) 8754 1726776629.31104: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpa89z89tc /root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636/AnsiballZ_slurp.py <<< 8754 1726776629.32121: stderr chunk (state=3): >>><<< 8754 1726776629.32127: stdout chunk (state=3): >>><<< 8754 1726776629.32147: done transferring module to remote 8754 1726776629.32159: 
_low_level_execute_command(): starting 8754 1726776629.32165: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636/ /root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636/AnsiballZ_slurp.py && sleep 0' 8754 1726776629.34454: stderr chunk (state=2): >>><<< 8754 1726776629.34462: stdout chunk (state=2): >>><<< 8754 1726776629.34476: _low_level_execute_command() done: rc=0, stdout=, stderr= 8754 1726776629.34481: _low_level_execute_command(): starting 8754 1726776629.34485: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636/AnsiballZ_slurp.py && sleep 0' 8754 1726776629.49036: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 8754 1726776629.50095: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8754 1726776629.50105: stdout chunk (state=3): >>><<< 8754 1726776629.50116: stderr chunk (state=3): >>><<< 8754 1726776629.50132: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.14.221 closed. 
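The slurp result above returns /etc/tuned/active_profile base64-encoded in its `content` field. Decoding that payload shows why a further cleanup step is still needed (note: `base64 -d` is the GNU coreutils spelling; BSD/macOS uses `base64 -D`):

```shell
# Decode the content field from the "Get active_profile" slurp result.
printf '%s' 'dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK' | base64 -d
```

This prints `virtual-guest kernel_settings`, i.e. the active profile still carries the `kernel_settings` token, which is what the next task, "Ensure kernel_settings is not in active_profile", has to strip out.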
8754 1726776629.50160: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8754 1726776629.50172: _low_level_execute_command(): starting 8754 1726776629.50178: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776629.2583573-8754-268101759512636/ > /dev/null 2>&1 && sleep 0' 8754 1726776629.53570: stderr chunk (state=2): >>><<< 8754 1726776629.53582: stdout chunk (state=2): >>><<< 8754 1726776629.53597: _low_level_execute_command() done: rc=0, stdout=, stderr= 8754 1726776629.53604: handler run complete 8754 1726776629.53619: attempt loop complete, returning result 8754 1726776629.53624: _execute() done 8754 1726776629.53627: dumping result to json 8754 1726776629.53632: done dumping result, returning 8754 1726776629.53639: done running TaskExecutor() for managed_node1/TASK: Get active_profile [120fa90a-8a95-f1be-6eb1-00000000009a] 8754 1726776629.53645: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000009a 8754 1726776629.53683: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000009a 8754 1726776629.53687: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8186 1726776629.54188: no more pending results, returning what we have 8186 1726776629.54191: results 
queue empty 8186 1726776629.54192: checking for any_errors_fatal 8186 1726776629.54197: done checking for any_errors_fatal 8186 1726776629.54198: checking for max_fail_percentage 8186 1726776629.54199: done checking for max_fail_percentage 8186 1726776629.54200: checking to see if all hosts have failed and the running result is not ok 8186 1726776629.54201: done checking to see if all hosts have failed 8186 1726776629.54201: getting the remaining hosts for this loop 8186 1726776629.54202: done getting the remaining hosts for this loop 8186 1726776629.54206: getting the next task for host managed_node1 8186 1726776629.54212: done getting next task for host managed_node1 8186 1726776629.54214: ^ task is: TASK: Ensure kernel_settings is not in active_profile 8186 1726776629.54217: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776629.54220: getting variables 8186 1726776629.54221: in VariableManager get_vars() 8186 1726776629.54256: Calling all_inventory to load vars for managed_node1 8186 1726776629.54259: Calling groups_inventory to load vars for managed_node1 8186 1726776629.54261: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776629.54270: Calling all_plugins_play to load vars for managed_node1 8186 1726776629.54272: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776629.54275: Calling groups_plugins_play to load vars for managed_node1 8186 1726776629.54439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776629.54640: done with get_vars() 8186 1726776629.54649: done getting variables 8186 1726776629.54692: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings is not in active_profile] ************************* task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46 Thursday 19 September 2024 16:10:29 -0400 (0:00:00.329) 0:00:17.418 **** 8186 1726776629.54712: entering _queue_task() for managed_node1/copy 8186 1726776629.54905: worker is 1 (out of 1 available) 8186 1726776629.54919: exiting _queue_task() for managed_node1/copy 8186 1726776629.54934: done queuing things up, now waiting for results queue to drain 8186 1726776629.54937: waiting for pending results... 
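The copy task queued here rewrites /etc/tuned/active_profile so it no longer contains the `kernel_settings` token. The exact Jinja expression the role uses is not visible in this excerpt, so the following is only a hedged shell sketch of the string edit the task has to perform:

```shell
# Assumption: the cleanup rewrites the active_profile string with the
# "kernel_settings" token (and any preceding space) removed; the
# role's actual template expression is not shown in this log.
cur_profile='virtual-guest kernel_settings'
new_profile=$(printf '%s' "$cur_profile" | sed -e 's/ *kernel_settings//' -e 's/^ *//')
printf '%s\n' "$new_profile"
```

The result is `virtual-guest`, i.e. the tuned profile the host was using before the kernel_settings role was applied.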
8774 1726776629.55054: running TaskExecutor() for managed_node1/TASK: Ensure kernel_settings is not in active_profile 8774 1726776629.55157: in run() - task 120fa90a-8a95-f1be-6eb1-00000000009b 8774 1726776629.55174: variable 'ansible_search_path' from source: unknown 8774 1726776629.55181: variable 'ansible_search_path' from source: unknown 8774 1726776629.55208: calling self._execute() 8774 1726776629.55272: variable 'ansible_host' from source: host vars for 'managed_node1' 8774 1726776629.55282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8774 1726776629.55291: variable 'omit' from source: magic vars 8774 1726776629.55365: variable 'omit' from source: magic vars 8774 1726776629.55395: variable 'omit' from source: magic vars 8774 1726776629.55413: variable '__active_profile' from source: task vars 8774 1726776629.55633: variable '__active_profile' from source: task vars 8774 1726776629.55780: variable '__cur_profile' from source: task vars 8774 1726776629.55886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8774 1726776629.58186: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8774 1726776629.58253: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8774 1726776629.58288: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8774 1726776629.58315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8774 1726776629.58347: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8774 1726776629.58401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
8774 1726776629.58432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8774 1726776629.58451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8774 1726776629.58474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8774 1726776629.58486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8774 1726776629.58561: variable '__kernel_settings_tuned_current_profile' from source: set_fact 8774 1726776629.58600: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 8774 1726776629.58650: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 8774 1726776629.58702: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 8774 1726776629.58719: variable 'omit' from source: magic vars 8774 1726776629.58739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8774 1726776629.58758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8774 1726776629.58773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8774 1726776629.58787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8774 1726776629.58795: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8774 1726776629.58815: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8774 1726776629.58819: variable 'ansible_host' from source: host vars for 'managed_node1' 8774 1726776629.58821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8774 1726776629.58909: Set connection var ansible_shell_executable to /bin/sh 8774 1726776629.58917: Set connection var ansible_timeout to 10 8774 1726776629.58923: Set connection var ansible_module_compression to ZIP_DEFLATED 8774 1726776629.58927: Set connection var ansible_connection to ssh 8774 1726776629.58939: Set connection var ansible_pipelining to False 8774 1726776629.58944: Set connection var ansible_shell_type to sh 8774 1726776629.58964: variable 'ansible_shell_executable' from source: unknown 8774 1726776629.58969: variable 'ansible_connection' from source: unknown 8774 1726776629.58971: variable 'ansible_module_compression' from source: unknown 8774 1726776629.58974: variable 'ansible_shell_type' from source: unknown 8774 1726776629.58979: variable 'ansible_shell_executable' from source: unknown 8774 1726776629.58982: variable 'ansible_host' from source: host vars for 'managed_node1' 8774 1726776629.58985: variable 'ansible_pipelining' from source: unknown 8774 1726776629.58987: variable 'ansible_timeout' from source: unknown 8774 1726776629.58991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8774 1726776629.59074: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8774 1726776629.59087: variable 'omit' from source: magic vars 8774 1726776629.59093: starting attempt loop 8774 
1726776629.59096: running the handler 8774 1726776629.59106: _low_level_execute_command(): starting 8774 1726776629.59114: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8774 1726776629.61560: stdout chunk (state=2): >>>/root <<< 8774 1726776629.61709: stderr chunk (state=3): >>><<< 8774 1726776629.61715: stdout chunk (state=3): >>><<< 8774 1726776629.61735: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8774 1726776629.61747: _low_level_execute_command(): starting 8774 1726776629.61752: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251 `" && echo ansible-tmp-1726776629.6174188-8774-160362776792251="` echo /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251 `" ) && sleep 0' 8774 1726776629.64788: stdout chunk (state=2): >>>ansible-tmp-1726776629.6174188-8774-160362776792251=/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251 <<< 8774 1726776629.64920: stderr chunk (state=3): >>><<< 8774 1726776629.64926: stdout chunk (state=3): >>><<< 8774 1726776629.64944: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776629.6174188-8774-160362776792251=/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251 , stderr= 8774 1726776629.65033: variable 'ansible_module_compression' from source: unknown 8774 1726776629.65082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8774 1726776629.65114: variable 'ansible_facts' from source: unknown 8774 1726776629.65213: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/AnsiballZ_stat.py 8774 1726776629.65637: Sending initial data 8774 1726776629.65643: Sent initial data (151 bytes) 8774 1726776629.68582: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpuqx_zqsg /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/AnsiballZ_stat.py <<< 8774 1726776629.69709: stderr chunk (state=3): >>><<< 8774 1726776629.69716: stdout chunk (state=3): >>><<< 8774 1726776629.69738: done transferring module to remote 8774 1726776629.69750: _low_level_execute_command(): starting 8774 1726776629.69756: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/ /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/AnsiballZ_stat.py && sleep 0' 8774 1726776629.72222: stderr chunk (state=2): >>><<< 8774 1726776629.72233: stdout chunk (state=2): >>><<< 8774 1726776629.72247: _low_level_execute_command() done: rc=0, stdout=, stderr= 8774 1726776629.72251: _low_level_execute_command(): starting 8774 1726776629.72256: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/AnsiballZ_stat.py && sleep 0' 8774 1726776629.88321: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 494928066, "dev": 51713, "nlink": 1, "atime": 1726776629.4897642, "mtime": 1726776627.6867535, "ctime": 1726776627.6867535, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1927871963", "attributes": [], "attr_flags": ""}, 
"invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8774 1726776629.89708: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8774 1726776629.89719: stdout chunk (state=3): >>><<< 8774 1726776629.89734: stderr chunk (state=3): >>><<< 8774 1726776629.89750: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 494928066, "dev": 51713, "nlink": 1, "atime": 1726776629.4897642, "mtime": 1726776627.6867535, "ctime": 1726776627.6867535, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "1927871963", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.14.221 closed. 
8774 1726776629.89806: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8774 1726776629.90267: Sending initial data 8774 1726776629.90274: Sent initial data (140 bytes) 8774 1726776629.93035: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpb0ebi07x /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/source <<< 8774 1726776629.94035: stderr chunk (state=3): >>><<< 8774 1726776629.94044: stdout chunk (state=3): >>><<< 8774 1726776629.94068: _low_level_execute_command(): starting 8774 1726776629.94075: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/ /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/source && sleep 0' 8774 1726776629.96682: stderr chunk (state=2): >>><<< 8774 1726776629.96692: stdout chunk (state=2): >>><<< 8774 1726776629.96709: _low_level_execute_command() done: rc=0, stdout=, stderr= 8774 1726776629.96732: variable 'ansible_module_compression' from source: unknown 8774 1726776629.96766: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8774 1726776629.96786: variable 'ansible_facts' from source: unknown 8774 
1726776629.96846: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/AnsiballZ_copy.py 8774 1726776629.96936: Sending initial data 8774 1726776629.96943: Sent initial data (151 bytes) 8774 1726776629.99531: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpltvxrqgi /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/AnsiballZ_copy.py <<< 8774 1726776630.00976: stderr chunk (state=3): >>><<< 8774 1726776630.00988: stdout chunk (state=3): >>><<< 8774 1726776630.01009: done transferring module to remote 8774 1726776630.01019: _low_level_execute_command(): starting 8774 1726776630.01024: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/ /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/AnsiballZ_copy.py && sleep 0' 8774 1726776630.03506: stderr chunk (state=2): >>><<< 8774 1726776630.03518: stdout chunk (state=2): >>><<< 8774 1726776630.03535: _low_level_execute_command() done: rc=0, stdout=, stderr= 8774 1726776630.03540: _low_level_execute_command(): starting 8774 1726776630.03545: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/AnsiballZ_copy.py && sleep 0' 8774 1726776630.19941: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": 
"/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/source", "_original_basename": "tmpb0ebi07x", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8774 1726776630.21052: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8774 1726776630.21100: stderr chunk (state=3): >>><<< 8774 1726776630.21108: stdout chunk (state=3): >>><<< 8774 1726776630.21123: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/source", "_original_basename": "tmpb0ebi07x", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
8774 1726776630.21149: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/source', '_original_basename': 'tmpb0ebi07x', 'follow': False, 'checksum': '633f07e1b5698d04352d5dca735869bf2fe77897', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8774 1726776630.21159: _low_level_execute_command(): starting 8774 1726776630.21165: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/ > /dev/null 2>&1 && sleep 0' 8774 1726776630.23552: stderr chunk (state=2): >>><<< 8774 1726776630.23560: stdout chunk (state=2): >>><<< 8774 1726776630.23576: _low_level_execute_command() done: rc=0, stdout=, stderr= 8774 1726776630.23585: handler run complete 8774 1726776630.23603: attempt loop complete, returning result 8774 1726776630.23606: _execute() done 8774 1726776630.23610: dumping result to json 8774 1726776630.23616: done dumping result, returning 8774 1726776630.23623: done running TaskExecutor() for managed_node1/TASK: Ensure kernel_settings is not in active_profile [120fa90a-8a95-f1be-6eb1-00000000009b] 8774 1726776630.23630: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000009b 8774 1726776630.23658: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000009b 8774 1726776630.23661: WORKER PROCESS 
EXITING changed: [managed_node1] => { "changed": true, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "src": "/root/.ansible/tmp/ansible-tmp-1726776629.6174188-8774-160362776792251/source", "state": "file", "uid": 0 } 8186 1726776630.23806: no more pending results, returning what we have 8186 1726776630.23809: results queue empty 8186 1726776630.23810: checking for any_errors_fatal 8186 1726776630.23817: done checking for any_errors_fatal 8186 1726776630.23818: checking for max_fail_percentage 8186 1726776630.23819: done checking for max_fail_percentage 8186 1726776630.23820: checking to see if all hosts have failed and the running result is not ok 8186 1726776630.23820: done checking to see if all hosts have failed 8186 1726776630.23821: getting the remaining hosts for this loop 8186 1726776630.23822: done getting the remaining hosts for this loop 8186 1726776630.23825: getting the next task for host managed_node1 8186 1726776630.23832: done getting next task for host managed_node1 8186 1726776630.23834: ^ task is: TASK: Set profile_mode to auto 8186 1726776630.23837: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776630.23840: getting variables 8186 1726776630.23841: in VariableManager get_vars() 8186 1726776630.23873: Calling all_inventory to load vars for managed_node1 8186 1726776630.23878: Calling groups_inventory to load vars for managed_node1 8186 1726776630.23880: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776630.23889: Calling all_plugins_play to load vars for managed_node1 8186 1726776630.23891: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776630.23893: Calling groups_plugins_play to load vars for managed_node1 8186 1726776630.24047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776630.24161: done with get_vars() 8186 1726776630.24169: done getting variables 8186 1726776630.24210: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set profile_mode to auto] ************************************************ task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57 Thursday 19 September 2024 16:10:30 -0400 (0:00:00.695) 0:00:18.114 **** 8186 1726776630.24233: entering _queue_task() for managed_node1/copy 8186 1726776630.24402: worker is 1 (out of 1 available) 8186 1726776630.24418: exiting _queue_task() for managed_node1/copy 8186 1726776630.24431: done queuing things up, now waiting for results queue to drain 8186 1726776630.24434: waiting for pending results... 
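The two tasks traced in this section (tasks/cleanup.yml:46 and :57) appear, from the module arguments in the trace, to be plain `copy` tasks writing tuned's state files. A hypothetical reconstruction, inferred from the trace rather than copied from the actual cleanup.yml:

```yaml
# Inferred sketch only: task names, dest paths, and mode come from the
# trace; the content expressions are assumptions.
- name: Ensure kernel_settings is not in active_profile
  copy:
    dest: /etc/tuned/active_profile
    mode: "0600"
    content: "{{ __active_profile }}"

- name: Set profile_mode to auto
  copy:
    dest: /etc/tuned/profile_mode
    mode: "0600"
    content: "auto"
```

The stat results in the trace (size 14 for active_profile, size 5 for profile_mode after the copy) are consistent with short single-line contents of this shape.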
8824 1726776630.24549: running TaskExecutor() for managed_node1/TASK: Set profile_mode to auto 8824 1726776630.24646: in run() - task 120fa90a-8a95-f1be-6eb1-00000000009c 8824 1726776630.24662: variable 'ansible_search_path' from source: unknown 8824 1726776630.24667: variable 'ansible_search_path' from source: unknown 8824 1726776630.24693: calling self._execute() 8824 1726776630.24754: variable 'ansible_host' from source: host vars for 'managed_node1' 8824 1726776630.24762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8824 1726776630.24771: variable 'omit' from source: magic vars 8824 1726776630.24847: variable 'omit' from source: magic vars 8824 1726776630.24876: variable 'omit' from source: magic vars 8824 1726776630.24896: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 8824 1726776630.25168: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 8824 1726776630.25250: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 8824 1726776630.25280: variable 'omit' from source: magic vars 8824 1726776630.25319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8824 1726776630.25358: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8824 1726776630.25379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8824 1726776630.25397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8824 1726776630.25408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8824 1726776630.25470: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8824 1726776630.25477: variable 'ansible_host' from source: host vars for 'managed_node1' 8824 
1726776630.25483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8824 1726776630.25579: Set connection var ansible_shell_executable to /bin/sh 8824 1726776630.25587: Set connection var ansible_timeout to 10 8824 1726776630.25594: Set connection var ansible_module_compression to ZIP_DEFLATED 8824 1726776630.25598: Set connection var ansible_connection to ssh 8824 1726776630.25606: Set connection var ansible_pipelining to False 8824 1726776630.25614: Set connection var ansible_shell_type to sh 8824 1726776630.25634: variable 'ansible_shell_executable' from source: unknown 8824 1726776630.25638: variable 'ansible_connection' from source: unknown 8824 1726776630.25641: variable 'ansible_module_compression' from source: unknown 8824 1726776630.25644: variable 'ansible_shell_type' from source: unknown 8824 1726776630.25646: variable 'ansible_shell_executable' from source: unknown 8824 1726776630.25649: variable 'ansible_host' from source: host vars for 'managed_node1' 8824 1726776630.25652: variable 'ansible_pipelining' from source: unknown 8824 1726776630.25655: variable 'ansible_timeout' from source: unknown 8824 1726776630.25658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8824 1726776630.25782: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8824 1726776630.25794: variable 'omit' from source: magic vars 8824 1726776630.25799: starting attempt loop 8824 1726776630.25802: running the handler 8824 1726776630.25812: _low_level_execute_command(): starting 8824 1726776630.25820: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8824 1726776630.28350: stdout chunk (state=2): >>>/root <<< 8824 1726776630.28467: stderr chunk (state=3): >>><<< 
8824 1726776630.28474: stdout chunk (state=3): >>><<< 8824 1726776630.28493: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8824 1726776630.28505: _low_level_execute_command(): starting 8824 1726776630.28511: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152 `" && echo ansible-tmp-1726776630.285008-8824-135799749882152="` echo /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152 `" ) && sleep 0' 8824 1726776630.31058: stdout chunk (state=2): >>>ansible-tmp-1726776630.285008-8824-135799749882152=/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152 <<< 8824 1726776630.31190: stderr chunk (state=3): >>><<< 8824 1726776630.31199: stdout chunk (state=3): >>><<< 8824 1726776630.31214: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776630.285008-8824-135799749882152=/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152 , stderr= 8824 1726776630.31289: variable 'ansible_module_compression' from source: unknown 8824 1726776630.31337: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8824 1726776630.31370: variable 'ansible_facts' from source: unknown 8824 1726776630.31440: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/AnsiballZ_stat.py 8824 1726776630.31529: Sending initial data 8824 1726776630.31538: Sent initial data (150 bytes) 8824 1726776630.34025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmp7l18cd1o /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/AnsiballZ_stat.py <<< 8824 1726776630.35116: stderr chunk (state=3): >>><<< 8824 1726776630.35124: stdout chunk (state=3): >>><<< 8824 1726776630.35148: done transferring module to remote 8824 1726776630.35159: 
_low_level_execute_command(): starting 8824 1726776630.35164: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/ /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/AnsiballZ_stat.py && sleep 0' 8824 1726776630.37730: stderr chunk (state=2): >>><<< 8824 1726776630.37742: stdout chunk (state=2): >>><<< 8824 1726776630.37757: _low_level_execute_command() done: rc=0, stdout=, stderr= 8824 1726776630.37761: _low_level_execute_command(): starting 8824 1726776630.37768: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/AnsiballZ_stat.py && sleep 0' 8824 1726776630.54017: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 501219463, "dev": 51713, "nlink": 1, "atime": 1726776627.6647532, "mtime": 1726776627.6877534, "ctime": 1726776627.6877534, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1292828549", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8824 1726776630.55185: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. 
<<< 8824 1726776630.55196: stdout chunk (state=3): >>><<< 8824 1726776630.55208: stderr chunk (state=3): >>><<< 8824 1726776630.55224: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 501219463, "dev": 51713, "nlink": 1, "atime": 1726776627.6647532, "mtime": 1726776627.6877534, "ctime": 1726776627.6877534, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "1292828549", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.14.221 closed. 
8824 1726776630.55287: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8824 1726776630.55758: Sending initial data 8824 1726776630.55765: Sent initial data (139 bytes) 8824 1726776630.58968: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpmx4z9shi /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/source <<< 8824 1726776630.59438: stderr chunk (state=3): >>><<< 8824 1726776630.59447: stdout chunk (state=3): >>><<< 8824 1726776630.59469: _low_level_execute_command(): starting 8824 1726776630.59475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/ /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/source && sleep 0' 8824 1726776630.62100: stderr chunk (state=2): >>><<< 8824 1726776630.62111: stdout chunk (state=2): >>><<< 8824 1726776630.62125: _low_level_execute_command() done: rc=0, stdout=, stderr= 8824 1726776630.62149: variable 'ansible_module_compression' from source: unknown 8824 1726776630.62186: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8824 1726776630.62204: variable 'ansible_facts' from source: unknown 8824 
1726776630.62263: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/AnsiballZ_copy.py 8824 1726776630.62610: Sending initial data 8824 1726776630.62617: Sent initial data (150 bytes) 8824 1726776630.65250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmpzxyzvf1n /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/AnsiballZ_copy.py <<< 8824 1726776630.66323: stderr chunk (state=3): >>><<< 8824 1726776630.66335: stdout chunk (state=3): >>><<< 8824 1726776630.66353: done transferring module to remote 8824 1726776630.66362: _low_level_execute_command(): starting 8824 1726776630.66368: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/ /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/AnsiballZ_copy.py && sleep 0' 8824 1726776630.68788: stderr chunk (state=2): >>><<< 8824 1726776630.68797: stdout chunk (state=2): >>><<< 8824 1726776630.68810: _low_level_execute_command() done: rc=0, stdout=, stderr= 8824 1726776630.68815: _low_level_execute_command(): starting 8824 1726776630.68820: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/AnsiballZ_copy.py && sleep 0' 8824 1726776630.85107: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/source", 
"_original_basename": "tmpmx4z9shi", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8824 1726776630.86710: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8824 1726776630.86722: stdout chunk (state=3): >>><<< 8824 1726776630.86736: stderr chunk (state=3): >>><<< 8824 1726776630.86751: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/source", "_original_basename": "tmpmx4z9shi", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.14.221 closed. 
8824 1726776630.86790: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/source', '_original_basename': 'tmpmx4z9shi', 'follow': False, 'checksum': '43683f4e92c48be4b00ddd86e011a4f27fcdbeb5', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8824 1726776630.86803: _low_level_execute_command(): starting 8824 1726776630.86810: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/ > /dev/null 2>&1 && sleep 0' 8824 1726776630.89625: stderr chunk (state=2): >>><<< 8824 1726776630.89637: stdout chunk (state=2): >>><<< 8824 1726776630.89655: _low_level_execute_command() done: rc=0, stdout=, stderr= 8824 1726776630.89665: handler run complete 8824 1726776630.89689: attempt loop complete, returning result 8824 1726776630.89695: _execute() done 8824 1726776630.89698: dumping result to json 8824 1726776630.89703: done dumping result, returning 8824 1726776630.89710: done running TaskExecutor() for managed_node1/TASK: Set profile_mode to auto [120fa90a-8a95-f1be-6eb1-00000000009c] 8824 1726776630.89716: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000009c 8824 1726776630.89758: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000009c changed: [managed_node1] => { "changed": true, "checksum": 
"43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "src": "/root/.ansible/tmp/ansible-tmp-1726776630.285008-8824-135799749882152/source", "state": "file", "uid": 0 } 8186 1726776630.90144: no more pending results, returning what we have 8186 1726776630.90146: results queue empty 8186 1726776630.90147: checking for any_errors_fatal 8186 1726776630.90155: done checking for any_errors_fatal 8186 1726776630.90156: checking for max_fail_percentage 8186 1726776630.90157: done checking for max_fail_percentage 8186 1726776630.90157: checking to see if all hosts have failed and the running result is not ok 8186 1726776630.90158: done checking to see if all hosts have failed 8186 1726776630.90158: getting the remaining hosts for this loop 8186 1726776630.90161: done getting the remaining hosts for this loop 8186 1726776630.90164: getting the next task for host managed_node1 8186 1726776630.90170: done getting next task for host managed_node1 8186 1726776630.90175: ^ task is: TASK: Restart tuned 8186 1726776630.90178: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8186 1726776630.90181: getting variables 8186 1726776630.90182: in VariableManager get_vars() 8186 1726776630.90209: Calling all_inventory to load vars for managed_node1 8186 1726776630.90210: Calling groups_inventory to load vars for managed_node1 8186 1726776630.90211: Calling all_plugins_inventory to load vars for managed_node1 8186 1726776630.90219: Calling all_plugins_play to load vars for managed_node1 8186 1726776630.90221: Calling groups_plugins_inventory to load vars for managed_node1 8186 1726776630.90224: Calling groups_plugins_play to load vars for managed_node1 8186 1726776630.90340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8186 1726776630.90457: done with get_vars() 8186 1726776630.90467: done getting variables 8186 1726776630.90509: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restart tuned] *********************************************************** task path: /tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64 Thursday 19 September 2024 16:10:30 -0400 (0:00:00.662) 0:00:18.777 **** 8186 1726776630.90531: entering _queue_task() for managed_node1/service 8824 1726776630.89762: WORKER PROCESS EXITING 8186 1726776630.90712: worker is 1 (out of 1 available) 8186 1726776630.90726: exiting _queue_task() for managed_node1/service 8186 1726776630.90739: done queuing things up, now waiting for results queue to drain 8186 1726776630.90741: waiting for pending results... 
8861 1726776630.90858: running TaskExecutor() for managed_node1/TASK: Restart tuned 8861 1726776630.90956: in run() - task 120fa90a-8a95-f1be-6eb1-00000000009d 8861 1726776630.90971: variable 'ansible_search_path' from source: unknown 8861 1726776630.90975: variable 'ansible_search_path' from source: unknown 8861 1726776630.91007: variable '__kernel_settings_services' from source: include_vars 8861 1726776630.91235: variable '__kernel_settings_services' from source: include_vars 8861 1726776630.91282: variable 'omit' from source: magic vars 8861 1726776630.91362: variable 'ansible_host' from source: host vars for 'managed_node1' 8861 1726776630.91371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8861 1726776630.91376: variable 'omit' from source: magic vars 8861 1726776630.91433: variable 'omit' from source: magic vars 8861 1726776630.91456: variable 'omit' from source: magic vars 8861 1726776630.91479: variable 'item' from source: unknown 8861 1726776630.91534: variable 'item' from source: unknown 8861 1726776630.91549: variable 'omit' from source: magic vars 8861 1726776630.91576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8861 1726776630.91601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8861 1726776630.91616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8861 1726776630.91684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8861 1726776630.91696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8861 1726776630.91717: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8861 1726776630.91720: variable 'ansible_host' from source: host vars for 'managed_node1' 8861 1726776630.91723: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8861 1726776630.91820: Set connection var ansible_shell_executable to /bin/sh 8861 1726776630.91828: Set connection var ansible_timeout to 10 8861 1726776630.91836: Set connection var ansible_module_compression to ZIP_DEFLATED 8861 1726776630.91838: Set connection var ansible_connection to ssh 8861 1726776630.91843: Set connection var ansible_pipelining to False 8861 1726776630.91847: Set connection var ansible_shell_type to sh 8861 1726776630.91859: variable 'ansible_shell_executable' from source: unknown 8861 1726776630.91862: variable 'ansible_connection' from source: unknown 8861 1726776630.91864: variable 'ansible_module_compression' from source: unknown 8861 1726776630.91865: variable 'ansible_shell_type' from source: unknown 8861 1726776630.91867: variable 'ansible_shell_executable' from source: unknown 8861 1726776630.91869: variable 'ansible_host' from source: host vars for 'managed_node1' 8861 1726776630.91871: variable 'ansible_pipelining' from source: unknown 8861 1726776630.91872: variable 'ansible_timeout' from source: unknown 8861 1726776630.91875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8861 1726776630.91966: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 8861 1726776630.91978: variable 'omit' from source: magic vars 8861 1726776630.91983: starting attempt loop 8861 1726776630.91985: running the handler 8861 1726776630.92056: variable 'ansible_facts' from source: unknown 8861 1726776630.92133: _low_level_execute_command(): starting 8861 1726776630.92143: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8861 1726776630.94596: stdout chunk (state=2): >>>/root <<< 
8861 1726776630.94715: stderr chunk (state=3): >>><<< 8861 1726776630.94723: stdout chunk (state=3): >>><<< 8861 1726776630.94753: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8861 1726776630.94769: _low_level_execute_command(): starting 8861 1726776630.94776: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391 `" && echo ansible-tmp-1726776630.9476285-8861-80682363239391="` echo /root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391 `" ) && sleep 0' 8861 1726776630.97258: stdout chunk (state=2): >>>ansible-tmp-1726776630.9476285-8861-80682363239391=/root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391 <<< 8861 1726776630.97402: stderr chunk (state=3): >>><<< 8861 1726776630.97412: stdout chunk (state=3): >>><<< 8861 1726776630.97432: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726776630.9476285-8861-80682363239391=/root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391 , stderr= 8861 1726776630.97464: variable 'ansible_module_compression' from source: unknown 8861 1726776630.97527: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8186o_g0xblh/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8861 1726776630.97578: variable 'ansible_facts' from source: unknown 8861 1726776630.97743: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391/AnsiballZ_systemd.py 8861 1726776630.97847: Sending initial data 8861 1726776630.97854: Sent initial data (153 bytes) 8861 1726776631.00737: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8186o_g0xblh/tmp2m6cj2kz /root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391/AnsiballZ_systemd.py <<< 8861 1726776631.03406: stderr chunk (state=3): >>><<< 8861 1726776631.03414: stdout chunk (state=3): >>><<< 8861 
1726776631.03434: done transferring module to remote 8861 1726776631.03443: _low_level_execute_command(): starting 8861 1726776631.03446: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391/ /root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391/AnsiballZ_systemd.py && sleep 0' 8861 1726776631.05800: stderr chunk (state=2): >>><<< 8861 1726776631.05809: stdout chunk (state=2): >>><<< 8861 1726776631.05823: _low_level_execute_command() done: rc=0, stdout=, stderr= 8861 1726776631.05828: _low_level_execute_command(): starting 8861 1726776631.05835: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391/AnsiballZ_systemd.py && sleep 0' 8861 1726776631.33537: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:27 EDT", "WatchdogTimestampMonotonic": "211718989", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9620", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ExecMainStartTimestampMonotonic": "211581820", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9620", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:27 EDT] ; stop_time=[n/a] ; pid=9620 ; code=(null) ; status=0/0 }", 
"Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15007744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryH<<< 8861 1726776631.33567: stdout chunk (state=3): >>>igh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22406", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": 
"infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service basic.target system.slice", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:27 EDT", "StateChangeTimestampMonotonic": "211718992", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:27 EDT", "InactiveExitTimestampMonotonic": "211581969", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ActiveEnterTimestampMonotonic": "211718992", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ActiveExitTimestampMonotonic": "211500516", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:27 EDT", "InactiveEnterTimestampMonotonic": "211579094", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": 
"yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ConditionTimestampMonotonic": "211580779", "AssertTimestamp": "Thu 2024-09-19 16:10:27 EDT", "AssertTimestampMonotonic": "211580781", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "efa9e6aae8ae4f5f9fe3511c6aa5ceb3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8861 1726776631.35256: stderr chunk (state=3): >>>Shared connection to 10.31.14.221 closed. <<< 8861 1726776631.35298: stderr chunk (state=3): >>><<< 8861 1726776631.35304: stdout chunk (state=3): >>><<< 8861 1726776631.35322: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 16:10:27 EDT", "WatchdogTimestampMonotonic": "211718989", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9620", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ExecMainStartTimestampMonotonic": "211581820", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9620", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 
2024-09-19 16:10:27 EDT] ; stop_time=[n/a] ; pid=9620 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "15007744", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22406", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "Before": "shutdown.target multi-user.target", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service basic.target system.slice", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 16:10:27 EDT", "StateChangeTimestampMonotonic": "211718992", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:27 EDT", "InactiveExitTimestampMonotonic": "211581969", "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ActiveEnterTimestampMonotonic": "211718992", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ActiveExitTimestampMonotonic": "211500516", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:27 EDT", "InactiveEnterTimestampMonotonic": "211579094", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ConditionTimestampMonotonic": "211580779", "AssertTimestamp": "Thu 2024-09-19 16:10:27 EDT", "AssertTimestampMonotonic": "211580781", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "efa9e6aae8ae4f5f9fe3511c6aa5ceb3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.14.221 closed. 8861 1726776631.35432: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8861 1726776631.35451: _low_level_execute_command(): starting 8861 1726776631.35457: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726776630.9476285-8861-80682363239391/ > /dev/null 2>&1 && sleep 0' 8861 1726776631.37823: stderr chunk (state=2): >>><<< 8861 1726776631.37832: stdout chunk (state=2): >>><<< 8861 1726776631.37845: _low_level_execute_command() done: rc=0, stdout=, stderr= 8861 1726776631.37854: handler run 
complete 8861 1726776631.37888: attempt loop complete, returning result 8861 1726776631.37905: variable 'item' from source: unknown 8861 1726776631.37970: variable 'item' from source: unknown ok: [managed_node1] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ActiveEnterTimestampMonotonic": "211718992", "ActiveExitTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ActiveExitTimestampMonotonic": "211500516", "ActiveState": "active", "After": "systemd-journald.socket systemd-sysctl.service dbus.service sysinit.target dbus.socket network.target polkit.service basic.target system.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 16:10:27 EDT", "AssertTimestampMonotonic": "211580781", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control 
cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ConditionTimestampMonotonic": "211580779", "ConfigurationDirectoryMode": "0755", "Conflicts": "cpupower.service tlp.service shutdown.target power-profiles-daemon.service auto-cpufreq.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9620", "ExecMainStartTimestamp": "Thu 2024-09-19 16:10:27 EDT", "ExecMainStartTimestampMonotonic": "211581820", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 16:10:27 EDT] ; stop_time=[n/a] ; pid=9620 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 16:10:27 EDT", "InactiveEnterTimestampMonotonic": "211579094", "InactiveExitTimestamp": "Thu 2024-09-19 16:10:27 EDT", "InactiveExitTimestampMonotonic": "211581969", "InvocationID": 
"efa9e6aae8ae4f5f9fe3511c6aa5ceb3", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "9620", "MemoryAccounting": "yes", "MemoryCurrent": "15007744", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", 
"ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 16:10:27 EDT", "StateChangeTimestampMonotonic": "211718992", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 16:10:27 EDT", "WatchdogTimestampMonotonic": "211718989", "WatchdogUSec": "0" } } 8861 1726776631.38068: dumping result to json 8861 1726776631.38089: done dumping result, returning 8861 1726776631.38097: done running 
TaskExecutor() for managed_node1/TASK: Restart tuned [120fa90a-8a95-f1be-6eb1-00000000009d]
8861 1726776631.38104: sending task result for task 120fa90a-8a95-f1be-6eb1-00000000009d
8861 1726776631.38212: done sending task result for task 120fa90a-8a95-f1be-6eb1-00000000009d
8861 1726776631.38217: WORKER PROCESS EXITING
8186 1726776631.38558: no more pending results, returning what we have
8186 1726776631.38560: results queue empty
8186 1726776631.38561: checking for any_errors_fatal
8186 1726776631.38564: done checking for any_errors_fatal
8186 1726776631.38564: checking for max_fail_percentage
8186 1726776631.38565: done checking for max_fail_percentage
8186 1726776631.38565: checking to see if all hosts have failed and the running result is not ok
8186 1726776631.38566: done checking to see if all hosts have failed
8186 1726776631.38566: getting the remaining hosts for this loop
8186 1726776631.38567: done getting the remaining hosts for this loop
8186 1726776631.38569: getting the next task for host managed_node1
8186 1726776631.38573: done getting next task for host managed_node1
8186 1726776631.38574: ^ task is: TASK: meta (flush_handlers)
8186 1726776631.38575: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8186 1726776631.38579: getting variables
8186 1726776631.38580: in VariableManager get_vars()
8186 1726776631.38600: Calling all_inventory to load vars for managed_node1
8186 1726776631.38602: Calling groups_inventory to load vars for managed_node1
8186 1726776631.38603: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776631.38610: Calling all_plugins_play to load vars for managed_node1
8186 1726776631.38611: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776631.38613: Calling groups_plugins_play to load vars for managed_node1
8186 1726776631.38722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776631.38831: done with get_vars()
8186 1726776631.38839: done getting variables
8186 1726776631.38886: in VariableManager get_vars()
8186 1726776631.38894: Calling all_inventory to load vars for managed_node1
8186 1726776631.38896: Calling groups_inventory to load vars for managed_node1
8186 1726776631.38897: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776631.38900: Calling all_plugins_play to load vars for managed_node1
8186 1726776631.38901: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776631.38902: Calling groups_plugins_play to load vars for managed_node1
8186 1726776631.38982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776631.39085: done with get_vars()
8186 1726776631.39094: done queuing things up, now waiting for results queue to drain
8186 1726776631.39095: results queue empty
8186 1726776631.39096: checking for any_errors_fatal
8186 1726776631.39100: done checking for any_errors_fatal
8186 1726776631.39100: checking for max_fail_percentage
8186 1726776631.39101: done checking for max_fail_percentage
8186 1726776631.39101: checking to see if all hosts have failed and the running result is not ok
8186 1726776631.39102: done checking to see if all hosts have failed
8186 1726776631.39102: getting the remaining hosts for this loop
8186 1726776631.39102: done getting the remaining hosts for this loop
8186 1726776631.39104: getting the next task for host managed_node1
8186 1726776631.39107: done getting next task for host managed_node1
8186 1726776631.39108: ^ task is: TASK: meta (flush_handlers)
8186 1726776631.39108: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8186 1726776631.39111: getting variables
8186 1726776631.39111: in VariableManager get_vars()
8186 1726776631.39117: Calling all_inventory to load vars for managed_node1
8186 1726776631.39118: Calling groups_inventory to load vars for managed_node1
8186 1726776631.39119: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776631.39122: Calling all_plugins_play to load vars for managed_node1
8186 1726776631.39123: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776631.39125: Calling groups_plugins_play to load vars for managed_node1
8186 1726776631.39203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776631.39305: done with get_vars()
8186 1726776631.39311: done getting variables
8186 1726776631.39341: in VariableManager get_vars()
8186 1726776631.39348: Calling all_inventory to load vars for managed_node1
8186 1726776631.39349: Calling groups_inventory to load vars for managed_node1
8186 1726776631.39350: Calling all_plugins_inventory to load vars for managed_node1
8186 1726776631.39353: Calling all_plugins_play to load vars for managed_node1
8186 1726776631.39354: Calling groups_plugins_inventory to load vars for managed_node1
8186 1726776631.39356: Calling groups_plugins_play to load vars for managed_node1
8186 1726776631.39434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
8186 1726776631.39535: done with get_vars()
8186 1726776631.39543: done queuing things up, now waiting for results queue to drain
8186 1726776631.39544: results queue empty
8186 1726776631.39545: checking for any_errors_fatal
8186 1726776631.39546: done checking for any_errors_fatal
8186 1726776631.39546: checking for max_fail_percentage
8186 1726776631.39547: done checking for max_fail_percentage
8186 1726776631.39547: checking to see if all hosts have failed and the running result is not ok
8186 1726776631.39547: done checking to see if all hosts have failed
8186 1726776631.39548: getting the remaining hosts for this loop
8186 1726776631.39548: done getting the remaining hosts for this loop
8186 1726776631.39549: getting the next task for host managed_node1
8186 1726776631.39551: done getting next task for host managed_node1
8186 1726776631.39552: ^ task is: None
8186 1726776631.39552: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
8186 1726776631.39553: done queuing things up, now waiting for results queue to drain
8186 1726776631.39553: results queue empty
8186 1726776631.39554: checking for any_errors_fatal
8186 1726776631.39554: done checking for any_errors_fatal
8186 1726776631.39554: checking for max_fail_percentage
8186 1726776631.39555: done checking for max_fail_percentage
8186 1726776631.39555: checking to see if all hosts have failed and the running result is not ok
8186 1726776631.39555: done checking to see if all hosts have failed
8186 1726776631.39556: getting the next task for host managed_node1
8186 1726776631.39558: done getting next task for host managed_node1
8186 1726776631.39558: ^ task is: None
8186 1726776631.39559: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node1              : ok=33   changed=8    unreachable=0    failed=0    skipped=8    rescued=1    ignored=1

Thursday 19 September 2024  16:10:31 -0400 (0:00:00.490)       0:00:19.268 ****
===============================================================================
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 5.89s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
Gathering Facts --------------------------------------------------------- 2.21s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:2
fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile --- 0.91s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes --- 0.74s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.72s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started --- 0.71s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67
Ensure kernel_settings is not in active_profile ------------------------- 0.70s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.68s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
Set profile_mode to auto ------------------------------------------------ 0.66s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.66s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly --- 0.62s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2
fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists --- 0.50s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74
Restart tuned ----------------------------------------------------------- 0.49s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64
fedora.linux_system_roles.kernel_settings : Check if system is ostree --- 0.43s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10
fedora.linux_system_roles.kernel_settings : Read tuned main config ------ 0.42s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42
fedora.linux_system_roles.kernel_settings : Get active_profile ---------- 0.41s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80
fedora.linux_system_roles.kernel_settings : Get current config ---------- 0.34s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107
Verify no settings ------------------------------------------------------ 0.34s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20
Remove kernel_settings tuned profile ------------------------------------ 0.33s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36
Get active_profile ------------------------------------------------------ 0.33s
/tmp/collections-uMf/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41
8186 1726776631.39628: RUNNING CLEANUP
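Editor's note: when auditing a run like the one above, the per-host counters on the PLAY RECAP line are the quickest health check. The following is an illustrative post-processing sketch (not part of the playbook output); it simply matches the `key=value` pairs ansible prints after the host name, using the `managed_node1` line from this log.

```python
import re

# One host line from the PLAY RECAP section of the log above.
recap_line = ("managed_node1 : ok=33 changed=8 unreachable=0 "
              "failed=0 skipped=8 rescued=1 ignored=1")

def parse_recap(line):
    """Return (host, {counter: int}) for a single PLAY RECAP host line."""
    host, _, counters = line.partition(":")
    stats = {k: int(v) for k, v in re.findall(r"(\w+)=(\d+)", counters)}
    return host.strip(), stats

host, stats = parse_recap(recap_line)
print(host, stats["ok"], stats["failed"])  # managed_node1 33 0
```

A run is typically considered clean when `failed` and `unreachable` are both zero, as they are here; `rescued=1` and `ignored=1` indicate a rescue block fired and one failure was ignored, which matches a negative test like `tests_bool_not_allowed.yml`.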
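Each debug record in this transcript has the shape `<pid> <epoch-seconds>: <message>` (e.g. `8186 1726776631.39628: RUNNING CLEANUP`). A small sketch for timing analysis of such traces, using two records copied from the log above; the parsing helper is illustrative, not part of ansible:

```python
import re

# Two trace records from the log above: "<pid> <epoch.micros>: <message>"
lines = [
    "8861 1726776631.35451: _low_level_execute_command(): starting",
    "8186 1726776631.39628: RUNNING CLEANUP",
]

TRACE = re.compile(r"^(\d+) (\d+\.\d+): (.*)$")

def parse_trace(line):
    """Split an ansible -vvv debug record into (pid, timestamp, message)."""
    pid, ts, msg = TRACE.match(line).groups()
    return int(pid), float(ts), msg

records = [parse_trace(line) for line in lines]
elapsed = records[-1][1] - records[0][1]
print(f"{elapsed:.5f}s between first and last record")
```

Note that two pids appear in this excerpt: 8186 is the main `ansible-playbook` process and 8861 a worker process, so grouping by pid before diffing timestamps gives per-process timelines.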
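The `status` dict in the systemd module result above carries the full systemd property dump as strings. A minimal sketch of the kind of health check one might run over it, using a hand-copied subset of the properties reported for `tuned.service` in this log (the `service_healthy` helper is hypothetical, not an ansible API):

```python
# A small subset of the "status" dict returned by the systemd module above.
tuned_status = {
    "Id": "tuned.service",
    "ActiveState": "active",
    "SubState": "running",
    "UnitFileState": "enabled",
    "MainPID": "9620",  # systemd reports every property as a string
}

def service_healthy(status):
    """True when the unit is enabled, active, and has a live main PID."""
    return (status["ActiveState"] == "active"
            and status["UnitFileState"] == "enabled"
            and int(status["MainPID"]) > 0)

print(service_healthy(tuned_status))  # True
```

This mirrors what the `enabled: true` / `state: started` arguments to the module assert declaratively: the task above reported `changed: false` because the unit was already in exactly this state.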