[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 8208 1726773019.13278: starting run ansible-playbook [core 2.16.11] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-EI7 executable location = /usr/local/bin/ansible-playbook python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 8208 1726773019.13588: Added group all to inventory 8208 1726773019.13589: Added group ungrouped to inventory 8208 1726773019.13592: Group all now contains ungrouped 8208 1726773019.13594: Examining possible inventory source: /tmp/kernel_settings-PVh/inventory.yml 8208 1726773019.22400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 8208 1726773019.22442: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 8208 1726773019.22460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 8208 1726773019.22501: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 8208 1726773019.22547: Loaded config def from plugin (inventory/script) 8208 1726773019.22549: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 8208 1726773019.22578: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 8208 1726773019.22637: Loaded config def from plugin (inventory/yaml) 8208 1726773019.22638: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 8208 1726773019.22701: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 8208 1726773019.22975: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 8208 1726773019.22978: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 8208 1726773019.22980: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 8208 1726773019.22984: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 8208 1726773019.22989: Loading data from /tmp/kernel_settings-PVh/inventory.yml 8208 1726773019.23030: /tmp/kernel_settings-PVh/inventory.yml was not parsable by auto 8208 1726773019.23074: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 8208 1726773019.23103: Loading data from /tmp/kernel_settings-PVh/inventory.yml 8208 1726773019.23158: group all already in inventory 8208 1726773019.23163: set inventory_file for managed_node1 8208 1726773019.23166: set inventory_dir for managed_node1 8208 1726773019.23166: Added host managed_node1 to inventory 8208 1726773019.23168: Added host managed_node1 to group all 8208 1726773019.23168: set ansible_host for managed_node1 8208 
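
The yaml inventory plugin has just parsed /tmp/kernel_settings-PVh/inventory.yml and is populating managed_node1 (and, just below, managed_node2 and managed_node3) with ansible_host and ansible_ssh_extra_args host vars, leaving them in the ungrouped group. The actual file is not included in this log; the following is only a minimal sketch of an inventory that would produce the same hosts and variables. Everything except the host names, the variable names taken from the log, and the 10.31.43.7 address the run later connects to for managed_node1 is a placeholder.

```yaml
# Hypothetical reconstruction of /tmp/kernel_settings-PVh/inventory.yml -- not the real file.
# Host names and the variables ansible_host / ansible_ssh_extra_args come from this log;
# addresses for managed_node2/3 and the SSH options are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: 10.31.43.7                                # address this run connects to below
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder
    managed_node2:
      ansible_host: 192.0.2.2                                 # placeholder
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder
    managed_node3:
      ansible_host: 192.0.2.3                                 # placeholder
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder
```
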
1726773019.23169: set ansible_ssh_extra_args for managed_node1 8208 1726773019.23170: set inventory_file for managed_node2 8208 1726773019.23172: set inventory_dir for managed_node2 8208 1726773019.23172: Added host managed_node2 to inventory 8208 1726773019.23173: Added host managed_node2 to group all 8208 1726773019.23174: set ansible_host for managed_node2 8208 1726773019.23174: set ansible_ssh_extra_args for managed_node2 8208 1726773019.23175: set inventory_file for managed_node3 8208 1726773019.23177: set inventory_dir for managed_node3 8208 1726773019.23177: Added host managed_node3 to inventory 8208 1726773019.23178: Added host managed_node3 to group all 8208 1726773019.23179: set ansible_host for managed_node3 8208 1726773019.23179: set ansible_ssh_extra_args for managed_node3 8208 1726773019.23181: Reconcile groups and hosts in inventory. 8208 1726773019.23183: Group ungrouped now contains managed_node1 8208 1726773019.23184: Group ungrouped now contains managed_node2 8208 1726773019.23187: Group ungrouped now contains managed_node3 8208 1726773019.23239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 8208 1726773019.23323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 8208 1726773019.23353: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 8208 1726773019.23373: Loaded config def from plugin (vars/host_group_vars) 8208 1726773019.23374: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 8208 1726773019.23379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 8208 1726773019.23384: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 8208 1726773019.23413: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 8208 1726773019.23719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773019.23805: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 8208 1726773019.23838: Loaded config def from plugin (connection/local) 8208 1726773019.23841: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 8208 1726773019.24373: Loaded config def from plugin (connection/paramiko_ssh) 8208 1726773019.24377: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 8208 1726773019.25159: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8208 1726773019.25202: Loaded config def from plugin (connection/psrp) 8208 1726773019.25205: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 8208 1726773019.25878: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8208 1726773019.25923: Loaded config def from plugin (connection/ssh) 8208 1726773019.25927: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 8208 1726773019.27622: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 8208 1726773019.27667: Loaded config def from plugin (connection/winrm) 8208 1726773019.27672: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 8208 1726773019.27706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 8208 1726773019.27778: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 8208 1726773019.27844: Loaded config def from plugin (shell/cmd) 8208 1726773019.27846: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 8208 1726773019.27878: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 8208 1726773019.27941: Loaded config def from plugin (shell/powershell) 8208 1726773019.27942: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 8208 1726773019.28010: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 8208 1726773019.28142: Loaded config def from plugin (shell/sh) 8208 1726773019.28144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 8208 1726773019.28177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 8208 1726773019.28253: Loaded config def from plugin (become/runas) 8208 1726773019.28256: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 8208 1726773019.28368: Loaded config def from plugin (become/su) 8208 1726773019.28370: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 8208 1726773019.28466: Loaded config def from plugin (become/sudo) 8208 1726773019.28468: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 8208 1726773019.28495: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml 8208 1726773019.28828: trying /usr/local/lib/python3.12/site-packages/ansible/modules 8208 1726773019.30846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 8208 1726773019.30992: in VariableManager get_vars() 8208 1726773019.31006: done with get_vars() 8208 1726773019.31038: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback 8208 1726773019.31046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 8208 1726773019.31260: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 8208 1726773019.31359: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 8208 1726773019.31361: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-EI7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 8208 1726773019.31382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 8208 1726773019.31400: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 8208 1726773019.31497: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 8208 1726773019.31533: Loaded config def from plugin (callback/default) 8208 1726773019.31535: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8208 1726773019.32269: Loaded config def from plugin (callback/junit) 8208 1726773019.32271: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8208 1726773019.32304: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 8208 1726773019.32343: Loaded config def from plugin (callback/minimal) 8208 1726773019.32344: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8208 1726773019.32374: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 8208 1726773019.32413: Loaded config def from plugin (callback/tree) 8208 1726773019.32415: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 8208 1726773019.32625: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 8208 1726773019.32626: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-EI7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_bool_not_allowed.yml ******************************************* 1 plays in /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml 8208 1726773019.32645: in VariableManager get_vars() 8208 1726773019.32660: done with get_vars() 8208 1726773019.32665: in VariableManager get_vars() 8208 1726773019.32670: done with get_vars() 8208 1726773019.32673: variable 'omit' from source: magic vars 8208 1726773019.32700: in VariableManager get_vars() 8208 1726773019.32709: done with get_vars() 8208 1726773019.32722: variable 'omit' from source: magic vars PLAY [Test boolean values not allowed] ***************************************** 8208 1726773019.33114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 8208 1726773019.33169: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 8208 1726773019.33195: getting the remaining hosts for this loop 8208 1726773019.33196: done getting the remaining hosts for this loop 8208 1726773019.33198: getting the next task for host managed_node1 8208 1726773019.33200: done getting next task for host managed_node1 8208 1726773019.33201: ^ task is: TASK: Gathering Facts 8208 1726773019.33202: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8208 1726773019.33207: getting variables 8208 1726773019.33208: in VariableManager get_vars() 8208 1726773019.33215: Calling all_inventory to load vars for managed_node1 8208 1726773019.33216: Calling groups_inventory to load vars for managed_node1 8208 1726773019.33218: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773019.33227: Calling all_plugins_play to load vars for managed_node1 8208 1726773019.33234: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773019.33236: Calling groups_plugins_play to load vars for managed_node1 8208 1726773019.33270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773019.33306: done with get_vars() 8208 1726773019.33310: done getting variables 8208 1726773019.33361: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:2 Thursday 19 September 2024 15:10:19 -0400 (0:00:00.008) 0:00:00.008 **** 8208 1726773019.33376: entering _queue_task() for managed_node1/gather_facts 8208 1726773019.33377: Creating lock for gather_facts 8208 1726773019.33602: worker is 1 (out of 1 available) 8208 1726773019.33613: exiting _queue_task() for managed_node1/gather_facts 8208 1726773019.33623: done queuing things up, now waiting for results queue to drain 8208 1726773019.33624: waiting for pending results... 8211 1726773019.33699: running TaskExecutor() for managed_node1/TASK: Gathering Facts 8211 1726773019.33795: in run() - task 0affffe7-6841-f581-0619-00000000000d 8211 1726773019.33810: variable 'ansible_search_path' from source: unknown 8211 1726773019.33839: calling self._execute() 8211 1726773019.33886: variable 'ansible_host' from source: host vars for 'managed_node1' 8211 1726773019.33895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8211 1726773019.33903: variable 'omit' from source: magic vars 8211 1726773019.33970: variable 'omit' from source: magic vars 8211 1726773019.33995: variable 'omit' from source: magic vars 8211 1726773019.34018: variable 'omit' from source: magic vars 8211 1726773019.34051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8211 1726773019.34079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8211 1726773019.34100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8211 1726773019.34115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8211 1726773019.34126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8211 1726773019.34148: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8211 1726773019.34154: variable 'ansible_host' from source: host vars for 'managed_node1' 8211 1726773019.34158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8211 1726773019.34227: Set connection var ansible_shell_executable to /bin/sh 8211 1726773019.34233: Set connection var ansible_connection to ssh 8211 1726773019.34240: Set connection var 
ansible_module_compression to ZIP_DEFLATED 8211 1726773019.34247: Set connection var ansible_timeout to 10 8211 1726773019.34250: Set connection var ansible_shell_type to sh 8211 1726773019.34257: Set connection var ansible_pipelining to False 8211 1726773019.34274: variable 'ansible_shell_executable' from source: unknown 8211 1726773019.34278: variable 'ansible_connection' from source: unknown 8211 1726773019.34281: variable 'ansible_module_compression' from source: unknown 8211 1726773019.34286: variable 'ansible_shell_type' from source: unknown 8211 1726773019.34290: variable 'ansible_shell_executable' from source: unknown 8211 1726773019.34293: variable 'ansible_host' from source: host vars for 'managed_node1' 8211 1726773019.34298: variable 'ansible_pipelining' from source: unknown 8211 1726773019.34301: variable 'ansible_timeout' from source: unknown 8211 1726773019.34305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8211 1726773019.34428: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 8211 1726773019.34438: variable 'omit' from source: magic vars 8211 1726773019.34443: starting attempt loop 8211 1726773019.34447: running the handler 8211 1726773019.34459: variable 'ansible_facts' from source: unknown 8211 1726773019.34475: _low_level_execute_command(): starting 8211 1726773019.34482: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8211 1726773019.51747: stderr chunk (state=2): >>>Warning: Permanently added '10.31.43.7' (ECDSA) to the list of known hosts. <<< 8211 1726773020.01444: stdout chunk (state=3): >>>/root <<< 8211 1726773020.01692: stderr chunk (state=3): >>><<< 8211 1726773020.01700: stdout chunk (state=3): >>><<< 8211 1726773020.01720: _low_level_execute_command() done: rc=0, stdout=/root , stderr=Warning: Permanently added '10.31.43.7' (ECDSA) to the list of known hosts. 
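
Pipelining is off for this run (ansible_pipelining is set to False above), which is why the module execution that follows goes through a mkdir / sftp / chmod round trip before the payload runs. As an illustrative aside only, the same connection variables negotiated above can be pinned per host or group in the inventory; a sketch, using the standard ansible-core variable names that appear in this log:

```yaml
# Illustrative only: the connection variables negotiated above, expressed as group vars.
# Turning ansible_pipelining on would typically avoid the temp-dir/sftp steps seen
# later in this log; this run had it set to False.
all:
  vars:
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_timeout: 10
    ansible_module_compression: ZIP_DEFLATED
    ansible_pipelining: true
```
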
8211 1726773020.01734: _low_level_execute_command(): starting 8211 1726773020.01740: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790 `" && echo ansible-tmp-1726773020.017286-8211-58718323230790="` echo /root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790 `" ) && sleep 0' 8211 1726773020.04348: stdout chunk (state=2): >>>ansible-tmp-1726773020.017286-8211-58718323230790=/root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790 <<< 8211 1726773020.04477: stderr chunk (state=3): >>><<< 8211 1726773020.04487: stdout chunk (state=3): >>><<< 8211 1726773020.04503: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773020.017286-8211-58718323230790=/root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790 , stderr= 8211 1726773020.04527: variable 'ansible_module_compression' from source: unknown 8211 1726773020.04582: ANSIBALLZ: Using generic lock for ansible.legacy.setup 8211 1726773020.04589: ANSIBALLZ: Acquiring lock 8211 1726773020.04592: ANSIBALLZ: Lock acquired: 139627423671568 8211 1726773020.04596: ANSIBALLZ: Creating module 8211 1726773020.26444: ANSIBALLZ: Writing module into payload 8211 1726773020.26562: ANSIBALLZ: Writing module 8211 1726773020.26588: ANSIBALLZ: Renaming module 8211 1726773020.26595: ANSIBALLZ: Done creating module 8211 1726773020.26623: variable 'ansible_facts' from source: unknown 8211 1726773020.26629: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8211 1726773020.26638: _low_level_execute_command(): starting 8211 1726773020.26644: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0' 8211 1726773020.29014: stdout chunk (state=2): >>>PLATFORM <<< 8211 1726773020.29071: stdout chunk (state=3): >>>Linux <<< 8211 1726773020.29087: stdout chunk (state=3): >>>FOUND <<< 8211 1726773020.29095: stdout chunk (state=3): >>>/usr/bin/python3.12 <<< 8211 1726773020.29110: stdout chunk (state=3): >>>/usr/bin/python3.6 <<< 8211 1726773020.29128: stdout chunk (state=3): >>>/usr/bin/python3 <<< 8211 1726773020.29137: stdout chunk (state=3): >>>/usr/libexec/platform-python ENDFOUND <<< 8211 1726773020.29283: stderr chunk (state=3): >>><<< 8211 1726773020.29291: stdout chunk (state=3): >>><<< 8211 1726773020.29305: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3.6 /usr/bin/python3 /usr/libexec/platform-python ENDFOUND , stderr= 8211 1726773020.29312 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3.6', '/usr/bin/python3', '/usr/libexec/platform-python'] 8211 1726773020.29347: _low_level_execute_command(): starting 8211 1726773020.29354: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 8211 1726773020.29432: Sending initial data 8211 1726773020.29439: Sent initial data (1234 bytes) 8211 1726773020.33403: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": 
"NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 8211 1726773020.33826: stderr chunk (state=3): >>><<< 8211 1726773020.33834: stdout chunk (state=3): >>><<< 8211 1726773020.33848: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"8\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"8\"\nPLATFORM_ID=\"platform:el8\"\nPRETTY_NAME=\"CentOS Stream 8\"\nANSI_COLOR=\"0;31\"\nCPE_NAME=\"cpe:/o:centos:centos:8\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 8\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr= 8211 1726773020.33896: variable 'ansible_facts' from source: unknown 8211 1726773020.33902: variable 'ansible_facts' from source: unknown 8211 1726773020.33911: variable 'ansible_module_compression' from source: unknown 8211 1726773020.33941: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 8211 1726773020.33969: variable 'ansible_facts' from source: unknown 8211 1726773020.34113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790/AnsiballZ_setup.py 8211 1726773020.34269: Sending initial data 8211 1726773020.34275: Sent initial data (150 bytes) 8211 1726773020.37186: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpnqmpixpb /root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790/AnsiballZ_setup.py <<< 8211 1726773020.39374: stderr chunk (state=3): >>><<< 8211 1726773020.39384: stdout chunk (state=3): >>><<< 8211 1726773020.39408: done transferring module to remote 8211 1726773020.39418: _low_level_execute_command(): starting 8211 1726773020.39424: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790/ /root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790/AnsiballZ_setup.py && sleep 0' 8211 1726773020.41901: stderr chunk (state=2): >>><<< 8211 1726773020.41912: stdout chunk (state=2): >>><<< 8211 1726773020.41928: _low_level_execute_command() done: rc=0, stdout=, stderr= 8211 1726773020.41933: _low_level_execute_command(): starting 8211 1726773020.41938: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790/AnsiballZ_setup.py && sleep 0' 8211 1726773021.56110: stdout chunk (state=2): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-43-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-43-7", "ansible_nodename": "ip-10-31-43-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "20fa030197864b10a41fb071d9598253", "ansible_user_id": 
"root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAOX19V0zY/KlpCQJrZ5cI5y01tpqtD9fUiC8EwirmEytFH+WjBHX20adCUkHx7bOLIKUCt6qPduXi+WiCpcvOeQvgTUKUWe0O7jpzbDHQ4WXfVsO7LbjTELqGJm2sriqGvmHIvoUCfS7CDgktMO2d+ZAJi4+xQFAOrd9SwI40H7ZAAAAFQCezGgzCEWAMo8wxJhelGdrqe0/kQAAAIEAlJDiGmO86y2cun+hrUZuXNpdaNwhKpzsxdzSIcFaXQ056Z5gFzeN7PE0QK6SnYeVBGg4KgJu/d2FrjGmorKslT82XS9oTIz0bGUow9d14UuvMtQNNU7CjwLybB9JsNW0URLkAsSoUJ9PKmBhLrj+1WZ2vfYBaiOhIvcHddz3VHkAAACBAJ3X1HAIvgrl5mVPvyV7dLqRs5IQ0jfKybX9pS/zF23g1ws48AuEUJicYEVVOTjBgUiYc0f/nXaLZmYN3yMrWHw4BWeaYYNreYQT7jGpOcdk0VYALiVaj9OI63vW2sBugkzlHsXUUtQ1l/bFyNuB0lJjAFwCjrDh0j8oXmtdhjVA", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDWmljFk0RvpQgjecIRIrhzQI4vUtvdgRjnYjpXCfXORbOYCpUQp/Gvx+FHH6uy2NXH3+kcbbI2G4QnLi5cjqF7L0tResLhlszj2KAgaph8W7WfjCW/zZHBuz9EKnbgZ0ioF91H9G0Lc6ROm0bUgT08AFV3bdw1WQghoF/kw7MD0w38Xf+k1Y8S0xUJzGQeB+t7cRUPLla0bHZ4Gdq9zg83HDKne6TrrHRd1tIYApokXt5WiPfFYaASDEeVTE6mNSJi+nA6Y5Y3Ls81uPW50FzuUKHlGGk6RUeFd86jYr9Yqzce3YjqNdT36W63QpEvK6LAJOUW0NWS0ab/ebqjRDo7ItLirGj6RacgoYebnf89PhSGuuLtLywQZdtTdMNfaHt/issw1szpWbgLKXEdBC2N0mf8vsXoJ86QULJATC+Dn6ireTqVjN4R9+qGLtt54/1+VUlK/28L1x8nf6nYHJnlSNfXGJKlUuMyI9rtJMqUiX0W+oAojCkxFMUFxgFAKYc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEeAvgJYu+0BNOsBtoPrzKcky5eNDOtKWWOEHIb50AaaCHVKgTx2r3P0MzAUxNITrjwkf+uT/XjTZnnMN00HtMM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICa6nDCUccezdRfsYRxb8T+73hu4elRZWMfVvQIgVbYe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": "/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "20", "epoch": "1726773020", "epoch_int": "1726773020", "date": "2024-09-19", "time": "15:10:20", "iso8601_micro": "2024-09-19T19:10:20.885236Z", "iso8601": "2024-09-19T19:10:20Z", "iso8601_basic": "20240919T151020885236", "iso8601_basic_short": "20240919T151020", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status<<< 8211 1726773021.56129: stdout chunk (state=3): >>>": "enabled", 
"policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:51:fc:65:fe:b7", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.43.7", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": "fe80::c51:fcff:fe65:feb7", "prefix": 
"64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmenta<<< 8211 1726773021.56159: stdout chunk (state=3): >>>tion": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.43.7", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:51:fc:65:fe:b7", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.43.7"], "ansible_all_ipv6_addresses": ["fe80::c51:fcff:fe65:feb7"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.43.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::c51:fcff:fe65:feb7"]}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.7 42504 10.31.43.7 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "5", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.14.7 42504 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only 
--read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.45, "5m": 0.33, "15m": 0.15}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2713, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 826, "free": 2713}, "nocache": {"free": 3300, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis<<< 8211 1726773021.56170: stdout chunk (state=3): >>>_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21a565-f240-bade-d1f7-c623ebbc3a3c", "ansible_product_uuid": "ec21a565-f240-bade-d1f7-c623ebbc3a3c", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 275, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263481700352, "block_size": 4096, "block_total": 65533179, "block_available": 64326587, "block_used": 1206592, "inode_total": 131071472, "inode_available": 130994307, "inode_used": 77165, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_pkg_mgr": "dnf", "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 8211 1726773021.57821: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. 
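
The setup module has just returned the full fact set for managed_node1 (it is echoed again below when the command result is assembled). When only a handful of these facts are needed for debugging, the same module can be run with a filter instead of a full gather; a hedged sketch, reusing only the module arguments (filter, gather_subset) that appear in the invocation block of this log:

```yaml
# Illustrative only: re-gather a subset of the facts shown above.
- name: Re-gather selected facts from managed_node1
  hosts: managed_node1
  gather_facts: false
  tasks:
    - name: Run setup with a filter
      ansible.builtin.setup:
        filter:
          - "ansible_distribution*"
          - "ansible_default_ipv4"
```
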
<<< 8211 1726773021.57830: stdout chunk (state=3): >>><<< 8211 1726773021.57841: stderr chunk (state=3): >>><<< 8211 1726773021.57871: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "4.18.0-553.5.1.el8.x86_64", "ansible_kernel_version": "#1 SMP Tue May 21 05:46:01 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.6.8", "ansible_fqdn": "ip-10-31-43-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-43-7", "ansible_nodename": "ip-10-31-43-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "20fa030197864b10a41fb071d9598253", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "8", "ansible_distribution_major_version": "8", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAOX19V0zY/KlpCQJrZ5cI5y01tpqtD9fUiC8EwirmEytFH+WjBHX20adCUkHx7bOLIKUCt6qPduXi+WiCpcvOeQvgTUKUWe0O7jpzbDHQ4WXfVsO7LbjTELqGJm2sriqGvmHIvoUCfS7CDgktMO2d+ZAJi4+xQFAOrd9SwI40H7ZAAAAFQCezGgzCEWAMo8wxJhelGdrqe0/kQAAAIEAlJDiGmO86y2cun+hrUZuXNpdaNwhKpzsxdzSIcFaXQ056Z5gFzeN7PE0QK6SnYeVBGg4KgJu/d2FrjGmorKslT82XS9oTIz0bGUow9d14UuvMtQNNU7CjwLybB9JsNW0URLkAsSoUJ9PKmBhLrj+1WZ2vfYBaiOhIvcHddz3VHkAAACBAJ3X1HAIvgrl5mVPvyV7dLqRs5IQ0jfKybX9pS/zF23g1ws48AuEUJicYEVVOTjBgUiYc0f/nXaLZmYN3yMrWHw4BWeaYYNreYQT7jGpOcdk0VYALiVaj9OI63vW2sBugkzlHsXUUtQ1l/bFyNuB0lJjAFwCjrDh0j8oXmtdhjVA", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDWmljFk0RvpQgjecIRIrhzQI4vUtvdgRjnYjpXCfXORbOYCpUQp/Gvx+FHH6uy2NXH3+kcbbI2G4QnLi5cjqF7L0tResLhlszj2KAgaph8W7WfjCW/zZHBuz9EKnbgZ0ioF91H9G0Lc6ROm0bUgT08AFV3bdw1WQghoF/kw7MD0w38Xf+k1Y8S0xUJzGQeB+t7cRUPLla0bHZ4Gdq9zg83HDKne6TrrHRd1tIYApokXt5WiPfFYaASDEeVTE6mNSJi+nA6Y5Y3Ls81uPW50FzuUKHlGGk6RUeFd86jYr9Yqzce3YjqNdT36W63QpEvK6LAJOUW0NWS0ab/ebqjRDo7ItLirGj6RacgoYebnf89PhSGuuLtLywQZdtTdMNfaHt/issw1szpWbgLKXEdBC2N0mf8vsXoJ86QULJATC+Dn6ireTqVjN4R9+qGLtt54/1+VUlK/28L1x8nf6nYHJnlSNfXGJKlUuMyI9rtJMqUiX0W+oAojCkxFMUFxgFAKYc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEeAvgJYu+0BNOsBtoPrzKcky5eNDOtKWWOEHIb50AaaCHVKgTx2r3P0MzAUxNITrjwkf+uT/XjTZnnMN00HtMM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICa6nDCUccezdRfsYRxb8T+73hu4elRZWMfVvQIgVbYe", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 6, "micro": 8, "releaselevel": "final", "serial": 0}, "version_info": [3, 6, 8, "final", 0], "executable": 
"/usr/libexec/platform-python", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "", "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Thursday", "weekday_number": "4", "weeknumber": "38", "day": "19", "hour": "15", "minute": "10", "second": "20", "epoch": "1726773020", "epoch_int": "1726773020", "date": "2024-09-19", "time": "15:10:20", "iso8601_micro": "2024-09-19T19:10:20.885236Z", "iso8601": "2024-09-19T19:10:20Z", "iso8601_basic": "20240919T151020885236", "iso8601_basic_short": "20240919T151020", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-4.18.0-553.5.1.el8.x86_64", "root": "UUID=fe591198-9082-4b15-9b62-e83518524cd2", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "quiet": true}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0e:51:fc:65:fe:b7", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.43.7", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22"}, "ipv6": [{"address": "fe80::c51:fcff:fe65:feb7", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "rx_udp_gro_forwarding": "off", "rx_gro_list": "off", "tls_hw_rx_offload": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.40.1", "interface": "eth0", "address": "10.31.43.7", "broadcast": "10.31.43.255", "netmask": "255.255.252.0", "network": "10.31.40.0", "prefix": "22", "macaddress": "0e:51:fc:65:fe:b7", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.43.7"], "ansible_all_ipv6_addresses": ["fe80::c51:fcff:fe65:feb7"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.43.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::c51:fcff:fe65:feb7"]}, "ansible_env": {"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.7 42504 
10.31.43.7 22", "_": "/usr/libexec/platform-python", "LANG": "en_US.UTF-8", "which_declare": "declare -f", "XDG_SESSION_ID": "5", "USER": "root", "SELINUX_ROLE_REQUESTED": "", "PWD": "/root", "HOME": "/root", "SSH_CLIENT": "10.31.14.7 42504 22", "SELINUX_LEVEL_REQUESTED": "", "SSH_TTY": "/dev/pts/0", "SHELL": "/bin/bash", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "2", "LOGNAME": "root", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "XDG_RUNTIME_DIR": "/run/user/0", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.45, "5m": 0.33, "15m": 0.15}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3539, "ansible_memfree_mb": 2713, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3539, "used": 826, "free": 2713}, "nocache": {"free": 3300, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec21a565-f240-bade-d1f7-c623ebbc3a3c", "ansible_product_uuid": "ec21a565-f240-bade-d1f7-c623ebbc3a3c", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "0", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["fe591198-9082-4b15-9b62-e83518524cd2"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["fe591198-9082-4b15-9b62-e83518524cd2"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 275, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 268423901184, "size_available": 263481700352, "block_size": 4096, "block_total": 65533179, "block_available": 64326587, "block_used": 1206592, "inode_total": 131071472, "inode_available": 130994307, "inode_used": 77165, "uuid": "fe591198-9082-4b15-9b62-e83518524cd2"}], "ansible_pkg_mgr": "dnf", "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": 
["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=Shared connection to 10.31.43.7 closed. 8211 1726773021.59187: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8211 1726773021.59212: _low_level_execute_command(): starting 8211 1726773021.59219: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773020.017286-8211-58718323230790/ > /dev/null 2>&1 && sleep 0' 8211 1726773021.62234: stderr chunk (state=2): >>><<< 8211 1726773021.62245: stdout chunk (state=2): >>><<< 8211 1726773021.62266: _low_level_execute_command() done: rc=0, stdout=, stderr= 8211 1726773021.62277: handler run complete 8211 1726773021.62395: variable 'ansible_facts' from source: unknown 8211 1726773021.62492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8211 1726773021.62798: variable 'ansible_facts' from source: unknown 8211 1726773021.62898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8211 1726773021.63028: attempt loop complete, returning result 8211 1726773021.63035: _execute() done 8211 1726773021.63039: dumping result to json 8211 1726773021.63068: done dumping result, returning 8211 1726773021.63076: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affffe7-6841-f581-0619-00000000000d] 8211 1726773021.63081: sending task result for task 0affffe7-6841-f581-0619-00000000000d 8211 1726773021.63260: done sending task result for task 0affffe7-6841-f581-0619-00000000000d 8211 1726773021.63264: WORKER PROCESS EXITING ok: [managed_node1] 8208 1726773021.63907: no more pending results, returning what we have 8208 1726773021.63910: results queue empty 8208 1726773021.63910: checking for any_errors_fatal 8208 1726773021.63912: done checking for any_errors_fatal 8208 1726773021.63912: checking for max_fail_percentage 8208 1726773021.63914: done checking for max_fail_percentage 8208 1726773021.63914: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.63915: done checking to see if all hosts have failed 8208 1726773021.63915: getting the remaining hosts for this loop 8208 1726773021.63917: done getting the remaining hosts for this loop 8208 1726773021.63920: getting the next task for host managed_node1 8208 1726773021.63928: done getting next task for host managed_node1 8208 1726773021.63929: ^ task is: TASK: meta (flush_handlers) 8208 1726773021.63931: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8208 1726773021.63934: getting variables 8208 1726773021.63935: in VariableManager get_vars() 8208 1726773021.63960: Calling all_inventory to load vars for managed_node1 8208 1726773021.63963: Calling groups_inventory to load vars for managed_node1 8208 1726773021.63965: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.63974: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.63976: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.63978: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.66420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.66613: done with get_vars() 8208 1726773021.66623: done getting variables 8208 1726773021.66668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 8208 1726773021.66726: in VariableManager get_vars() 8208 1726773021.66736: Calling all_inventory to load vars for managed_node1 8208 1726773021.66738: Calling groups_inventory to load vars for managed_node1 8208 1726773021.66740: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.66745: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.66747: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.66749: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.66883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.67072: done with get_vars() 8208 1726773021.67088: done queuing things up, now waiting for results queue to drain 8208 1726773021.67090: results queue empty 8208 1726773021.67091: checking for any_errors_fatal 8208 1726773021.67095: done checking for any_errors_fatal 8208 1726773021.67095: checking for max_fail_percentage 8208 1726773021.67096: done checking for max_fail_percentage 8208 1726773021.67097: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.67097: done checking to see if all hosts have failed 8208 1726773021.67098: getting the remaining hosts for this loop 8208 1726773021.67099: done getting the remaining hosts for this loop 8208 1726773021.67101: getting the next task for host managed_node1 8208 1726773021.67105: done getting next task for host managed_node1 8208 1726773021.67107: ^ task is: TASK: Try to pass a boolean value for sysctl value 8208 1726773021.67108: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8208 1726773021.67110: getting variables 8208 1726773021.67111: in VariableManager get_vars() 8208 1726773021.67118: Calling all_inventory to load vars for managed_node1 8208 1726773021.67121: Calling groups_inventory to load vars for managed_node1 8208 1726773021.67123: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.67128: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.67130: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.67133: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.67268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.67474: done with get_vars() 8208 1726773021.67481: done getting variables TASK [Try to pass a boolean value for sysctl value] **************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:7 Thursday 19 September 2024 15:10:21 -0400 (0:00:02.341) 0:00:02.349 **** 8208 1726773021.67553: entering _queue_task() for managed_node1/include_role 8208 1726773021.67557: Creating lock for include_role 8208 1726773021.67801: worker is 1 (out of 1 available) 8208 1726773021.67815: exiting _queue_task() for managed_node1/include_role 8208 1726773021.67825: done queuing things up, now waiting for results queue to drain 8208 1726773021.67827: waiting for pending results... 8250 1726773021.68260: running TaskExecutor() for managed_node1/TASK: Try to pass a boolean value for sysctl value 8250 1726773021.68380: in run() - task 0affffe7-6841-f581-0619-000000000006 8250 1726773021.68400: variable 'ansible_search_path' from source: unknown 8250 1726773021.68434: calling self._execute() 8250 1726773021.68501: variable 'ansible_host' from source: host vars for 'managed_node1' 8250 1726773021.68513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8250 1726773021.68523: variable 'omit' from source: magic vars 8250 1726773021.68620: _execute() done 8250 1726773021.68627: dumping result to json 8250 1726773021.68631: done dumping result, returning 8250 1726773021.68636: done running TaskExecutor() for managed_node1/TASK: Try to pass a boolean value for sysctl value [0affffe7-6841-f581-0619-000000000006] 8250 1726773021.68644: sending task result for task 0affffe7-6841-f581-0619-000000000006 8250 1726773021.68688: done sending task result for task 0affffe7-6841-f581-0619-000000000006 8250 1726773021.68693: WORKER PROCESS EXITING 8208 1726773021.69040: no more pending results, returning what we have 8208 1726773021.69045: in VariableManager get_vars() 8208 1726773021.69074: Calling all_inventory to load vars for managed_node1 8208 1726773021.69076: Calling groups_inventory to load vars for managed_node1 8208 1726773021.69079: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.69089: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.69092: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.69095: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.69259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.69440: done with get_vars() 8208 1726773021.69448: variable 'ansible_search_path' from source: unknown 8208 1726773021.69573: variable 'omit' from source: magic vars 8208 1726773021.69597: variable 'omit' from 
source: magic vars 8208 1726773021.69612: variable 'omit' from source: magic vars 8208 1726773021.69616: we have included files to process 8208 1726773021.69617: generating all_blocks data 8208 1726773021.69618: done generating all_blocks data 8208 1726773021.69618: processing included file: fedora.linux_system_roles.kernel_settings 8208 1726773021.69640: in VariableManager get_vars() 8208 1726773021.69652: done with get_vars() 8208 1726773021.69726: in VariableManager get_vars() 8208 1726773021.69740: done with get_vars() 8208 1726773021.69784: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8208 1726773021.69949: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8208 1726773021.70012: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8208 1726773021.70146: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8208 1726773021.70941: in VariableManager get_vars() 8208 1726773021.70964: done with get_vars() 8208 1726773021.72384: in VariableManager get_vars() 8208 1726773021.72409: done with get_vars() 8208 1726773021.72566: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8208 1726773021.73297: iterating over new_blocks loaded from include file 8208 1726773021.73299: in VariableManager get_vars() 8208 1726773021.73317: done with get_vars() 8208 1726773021.73319: filtering new block on tags 8208 1726773021.73338: done filtering new block on tags 8208 1726773021.73341: in VariableManager get_vars() 8208 1726773021.73360: done with get_vars() 8208 1726773021.73362: filtering new block on tags 8208 1726773021.73381: done filtering new block on tags 8208 1726773021.73384: in VariableManager get_vars() 8208 1726773021.73409: done with get_vars() 8208 1726773021.73411: filtering new block on tags 8208 1726773021.73451: done filtering new block on tags 8208 1726773021.73457: in VariableManager get_vars() 8208 1726773021.73473: done with get_vars() 8208 1726773021.73474: filtering new block on tags 8208 1726773021.73492: done filtering new block on tags 8208 1726773021.73494: done iterating over new_blocks loaded from include file 8208 1726773021.73495: extending task lists for all hosts with included blocks 8208 1726773021.73588: done extending task lists 8208 1726773021.73590: done processing included files 8208 1726773021.73590: results queue empty 8208 1726773021.73591: checking for any_errors_fatal 8208 1726773021.73592: done checking for any_errors_fatal 8208 1726773021.73593: checking for max_fail_percentage 8208 1726773021.73594: done checking for max_fail_percentage 8208 1726773021.73594: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.73595: done checking to see if all hosts have failed 8208 1726773021.73596: getting the remaining hosts for this loop 8208 1726773021.73597: done getting the remaining hosts for this loop 8208 1726773021.73599: getting the next task for host managed_node1 8208 1726773021.73603: done getting next task for host managed_node1 8208 1726773021.73605: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8208 1726773021.73607: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8208 1726773021.73615: getting variables 8208 1726773021.73616: in VariableManager get_vars() 8208 1726773021.73628: Calling all_inventory to load vars for managed_node1 8208 1726773021.73631: Calling groups_inventory to load vars for managed_node1 8208 1726773021.73633: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.73637: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.73639: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.73642: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.73807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.74001: done with get_vars() 8208 1726773021.74011: done getting variables 8208 1726773021.74077: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:10:21 -0400 (0:00:00.065) 0:00:02.415 **** 8208 1726773021.74109: entering _queue_task() for managed_node1/fail 8208 1726773021.74111: Creating lock for fail 8208 1726773021.74361: worker is 1 (out of 1 available) 8208 1726773021.74375: exiting _queue_task() for managed_node1/fail 8208 1726773021.74387: done queuing things up, now waiting for results queue to drain 8208 1726773021.74389: waiting for pending results... 
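Note: the include_role task queued above ("Try to pass a boolean value for sysctl value", tests_bool_not_allowed.yml:7) drives this negative test. The test file itself is not reproduced in this log, so the following is only a hedged sketch of what such an invocation could look like; the sysctl name and the exact variable layout are hypothetical, while the role name and the kernel_settings_sysctl variable are taken from the log.

# Hedged sketch only; fs.file-max and the list layout are hypothetical,
# the role name and kernel_settings_sysctl appear in the log above.
- name: Try to pass a boolean value for sysctl value
  include_role:
    name: fedora.linux_system_roles.kernel_settings
  vars:
    kernel_settings_sysctl:
      - name: fs.file-max   # hypothetical sysctl setting
        value: true         # boolean value, which the role is expected to reject
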
8251 1726773021.74668: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8251 1726773021.74799: in run() - task 0affffe7-6841-f581-0619-00000000002c 8251 1726773021.74816: variable 'ansible_search_path' from source: unknown 8251 1726773021.74821: variable 'ansible_search_path' from source: unknown 8251 1726773021.74858: calling self._execute() 8251 1726773021.74927: variable 'ansible_host' from source: host vars for 'managed_node1' 8251 1726773021.74936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8251 1726773021.74944: variable 'omit' from source: magic vars 8251 1726773021.75378: variable 'kernel_settings_sysctl' from source: include params 8251 1726773021.75399: variable '__kernel_settings_state_empty' from source: role '' all vars 8251 1726773021.75410: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 8251 1726773021.75769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8251 1726773021.78177: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8251 1726773021.78261: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8251 1726773021.78300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8251 1726773021.78336: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8251 1726773021.78365: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8251 1726773021.78438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8251 1726773021.78472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8251 1726773021.78499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8251 1726773021.78540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8251 1726773021.78557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8251 1726773021.78611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8251 1726773021.78634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8251 1726773021.78660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 8251 1726773021.78700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8251 1726773021.78713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8251 1726773021.78752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8251 1726773021.78779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8251 1726773021.78804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8251 1726773021.78841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8251 1726773021.78858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8251 1726773021.79162: variable 'kernel_settings_sysctl' from source: include params 8251 1726773021.79239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8251 1726773021.79400: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8251 1726773021.79437: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8251 1726773021.79469: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8251 1726773021.79499: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8251 1726773021.79745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8251 1726773021.79771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8251 1726773021.79798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8251 1726773021.79824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8251 1726773021.79865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8251 1726773021.79888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 8251 1726773021.79913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8251 1726773021.79939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8251 1726773021.79965: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): True 8251 1726773021.79974: variable 'omit' from source: magic vars 8251 1726773021.80018: variable 'omit' from source: magic vars 8251 1726773021.80050: variable 'omit' from source: magic vars 8251 1726773021.80079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8251 1726773021.80106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8251 1726773021.80123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8251 1726773021.80140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8251 1726773021.80151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8251 1726773021.80183: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8251 1726773021.80198: variable 'ansible_host' from source: host vars for 'managed_node1' 8251 1726773021.80202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8251 1726773021.80305: Set connection var ansible_shell_executable to /bin/sh 8251 1726773021.80311: Set connection var ansible_connection to ssh 8251 1726773021.80318: Set connection var ansible_module_compression to ZIP_DEFLATED 8251 1726773021.80327: Set connection var ansible_timeout to 10 8251 1726773021.80331: Set connection var ansible_shell_type to sh 8251 1726773021.80340: Set connection var ansible_pipelining to False 8251 1726773021.80367: variable 'ansible_shell_executable' from source: unknown 8251 1726773021.80373: variable 'ansible_connection' from source: unknown 8251 1726773021.80377: variable 'ansible_module_compression' from source: unknown 8251 1726773021.80380: variable 'ansible_shell_type' from source: unknown 8251 1726773021.80383: variable 'ansible_shell_executable' from source: unknown 8251 1726773021.80388: variable 'ansible_host' from source: host vars for 'managed_node1' 8251 1726773021.80392: variable 'ansible_pipelining' from source: unknown 8251 1726773021.80395: variable 'ansible_timeout' from source: unknown 8251 1726773021.80399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8251 1726773021.80487: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8251 1726773021.80499: variable 'omit' from source: magic vars 8251 1726773021.80505: starting attempt loop 8251 1726773021.80508: running the handler 8251 
1726773021.80517: handler run complete 8251 1726773021.80544: attempt loop complete, returning result 8251 1726773021.80548: _execute() done 8251 1726773021.80551: dumping result to json 8251 1726773021.80557: done dumping result, returning 8251 1726773021.80564: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-f581-0619-00000000002c] 8251 1726773021.80570: sending task result for task 0affffe7-6841-f581-0619-00000000002c 8251 1726773021.80600: done sending task result for task 0affffe7-6841-f581-0619-00000000002c 8251 1726773021.80603: WORKER PROCESS EXITING 8208 1726773021.80963: marking managed_node1 as failed 8208 1726773021.80972: marking host managed_node1 failed, current state: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8208 1726773021.80980: ^ failed state is now: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=5, fail_state=2, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8208 1726773021.80982: getting the next task for host managed_node1 8208 1726773021.80988: done getting next task for host managed_node1 8208 1726773021.80991: ^ task is: TASK: Check for sysctl bool value error 8208 1726773021.80992: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False fatal: [managed_node1]: FAILED! 
=> { "changed": false } MSG: Boolean values are not allowed for sysctl settings 8208 1726773021.81115: no more pending results, returning what we have 8208 1726773021.81118: results queue empty 8208 1726773021.81118: checking for any_errors_fatal 8208 1726773021.81122: done checking for any_errors_fatal 8208 1726773021.81122: checking for max_fail_percentage 8208 1726773021.81123: done checking for max_fail_percentage 8208 1726773021.81124: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.81124: done checking to see if all hosts have failed 8208 1726773021.81125: getting the remaining hosts for this loop 8208 1726773021.81127: done getting the remaining hosts for this loop 8208 1726773021.81130: getting the next task for host managed_node1 8208 1726773021.81133: done getting next task for host managed_node1 8208 1726773021.81134: ^ task is: TASK: Check for sysctl bool value error 8208 1726773021.81135: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=0, handlers=0, run_state=2, fail_state=2, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8208 1726773021.81140: getting variables 8208 1726773021.81142: in VariableManager get_vars() 8208 1726773021.81173: Calling all_inventory to load vars for managed_node1 8208 1726773021.81175: Calling groups_inventory to load vars for managed_node1 8208 1726773021.81177: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.81186: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.81189: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.81191: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.81375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.81567: done with get_vars() 8208 1726773021.81577: done getting variables 8208 1726773021.81671: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Check for sysctl bool value error] *************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:25 Thursday 19 September 2024 15:10:21 -0400 (0:00:00.075) 0:00:02.491 **** 8208 1726773021.81704: entering _queue_task() for managed_node1/assert 8208 1726773021.81705: Creating lock for assert 8208 1726773021.81930: worker is 1 (out of 1 available) 8208 1726773021.81943: exiting _queue_task() for managed_node1/assert 8208 1726773021.81952: done queuing things up, now waiting for results queue to drain 8208 1726773021.81956: waiting for pending results... 
8252 1726773021.82166: running TaskExecutor() for managed_node1/TASK: Check for sysctl bool value error 8252 1726773021.82272: in run() - task 0affffe7-6841-f581-0619-000000000008 8252 1726773021.82293: variable 'ansible_search_path' from source: unknown 8252 1726773021.82328: calling self._execute() 8252 1726773021.82400: variable 'ansible_host' from source: host vars for 'managed_node1' 8252 1726773021.82409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8252 1726773021.82417: variable 'omit' from source: magic vars 8252 1726773021.82521: variable 'omit' from source: magic vars 8252 1726773021.82553: variable 'omit' from source: magic vars 8252 1726773021.82587: variable 'omit' from source: magic vars 8252 1726773021.82625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8252 1726773021.82662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8252 1726773021.82684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8252 1726773021.82705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8252 1726773021.82718: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8252 1726773021.82749: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8252 1726773021.82758: variable 'ansible_host' from source: host vars for 'managed_node1' 8252 1726773021.82764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8252 1726773021.82867: Set connection var ansible_shell_executable to /bin/sh 8252 1726773021.82873: Set connection var ansible_connection to ssh 8252 1726773021.82880: Set connection var ansible_module_compression to ZIP_DEFLATED 8252 1726773021.82889: Set connection var ansible_timeout to 10 8252 1726773021.82893: Set connection var ansible_shell_type to sh 8252 1726773021.82901: Set connection var ansible_pipelining to False 8252 1726773021.82923: variable 'ansible_shell_executable' from source: unknown 8252 1726773021.82928: variable 'ansible_connection' from source: unknown 8252 1726773021.82931: variable 'ansible_module_compression' from source: unknown 8252 1726773021.82934: variable 'ansible_shell_type' from source: unknown 8252 1726773021.82937: variable 'ansible_shell_executable' from source: unknown 8252 1726773021.82940: variable 'ansible_host' from source: host vars for 'managed_node1' 8252 1726773021.82943: variable 'ansible_pipelining' from source: unknown 8252 1726773021.82946: variable 'ansible_timeout' from source: unknown 8252 1726773021.82950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8252 1726773021.83081: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8252 1726773021.83189: variable 'omit' from source: magic vars 8252 1726773021.83197: starting attempt loop 8252 1726773021.83200: running the handler 8252 1726773021.83550: variable 'ansible_failed_result' from source: set_fact 8252 1726773021.83571: Evaluated conditional (ansible_failed_result.msg != 'UNREACH'): True 8252 1726773021.83578: handler run 
complete 8252 1726773021.83593: attempt loop complete, returning result 8252 1726773021.83596: _execute() done 8252 1726773021.83599: dumping result to json 8252 1726773021.83602: done dumping result, returning 8252 1726773021.83607: done running TaskExecutor() for managed_node1/TASK: Check for sysctl bool value error [0affffe7-6841-f581-0619-000000000008] 8252 1726773021.83613: sending task result for task 0affffe7-6841-f581-0619-000000000008 8252 1726773021.83638: done sending task result for task 0affffe7-6841-f581-0619-000000000008 8252 1726773021.83641: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 8208 1726773021.83989: no more pending results, returning what we have 8208 1726773021.83993: results queue empty 8208 1726773021.83993: checking for any_errors_fatal 8208 1726773021.83998: done checking for any_errors_fatal 8208 1726773021.83999: checking for max_fail_percentage 8208 1726773021.84000: done checking for max_fail_percentage 8208 1726773021.84001: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.84002: done checking to see if all hosts have failed 8208 1726773021.84002: getting the remaining hosts for this loop 8208 1726773021.84004: done getting the remaining hosts for this loop 8208 1726773021.84007: getting the next task for host managed_node1 8208 1726773021.84014: done getting next task for host managed_node1 8208 1726773021.84017: ^ task is: TASK: Cleanup 8208 1726773021.84019: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? True, did start at task? False 8208 1726773021.84022: getting variables 8208 1726773021.84023: in VariableManager get_vars() 8208 1726773021.84059: Calling all_inventory to load vars for managed_node1 8208 1726773021.84062: Calling groups_inventory to load vars for managed_node1 8208 1726773021.84064: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.84073: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.84076: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.84079: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.84247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.84445: done with get_vars() 8208 1726773021.84459: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:30 Thursday 19 September 2024 15:10:21 -0400 (0:00:00.028) 0:00:02.519 **** 8208 1726773021.84547: entering _queue_task() for managed_node1/include_tasks 8208 1726773021.84549: Creating lock for include_tasks 8208 1726773021.84765: worker is 1 (out of 1 available) 8208 1726773021.84779: exiting _queue_task() for managed_node1/include_tasks 8208 1726773021.84791: done queuing things up, now waiting for results queue to drain 8208 1726773021.84793: waiting for pending results... 
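Note: the assert that just passed ("Check for sysctl bool value error", tests_bool_not_allowed.yml:25) runs from the rescue section of the test, and the Cleanup include queued above runs from its always section (the HOST STATE lines show rescue=1 and then always=1). The log also shows ansible_failed_result coming from set_fact. The following is a hedged sketch of a block/rescue/always layout consistent with that, not a verbatim copy of the test file.

# Hedged layout sketch; task bodies are abbreviated and the set_fact
# step is inferred from the "from source: set_fact" line above.
- name: Expect the role to fail on a boolean sysctl value
  block:
    - name: Try to pass a boolean value for sysctl value
      include_role:
        name: fedora.linux_system_roles.kernel_settings
      # vars as in the earlier sketch
  rescue:
    - name: Save the failure result as a regular fact
      set_fact:
        ansible_failed_result: "{{ ansible_failed_result }}"

    - name: Check for sysctl bool value error
      assert:
        that:
          - ansible_failed_result.msg != 'UNREACH'   # conditional evaluated in the log
  always:
    - name: Cleanup
      include_tasks: tasks/cleanup.yml
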
8253 1726773021.84981: running TaskExecutor() for managed_node1/TASK: Cleanup 8253 1726773021.85089: in run() - task 0affffe7-6841-f581-0619-000000000009 8253 1726773021.85106: variable 'ansible_search_path' from source: unknown 8253 1726773021.85139: calling self._execute() 8253 1726773021.85210: variable 'ansible_host' from source: host vars for 'managed_node1' 8253 1726773021.85219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8253 1726773021.85228: variable 'omit' from source: magic vars 8253 1726773021.85322: _execute() done 8253 1726773021.85390: dumping result to json 8253 1726773021.85397: done dumping result, returning 8253 1726773021.85402: done running TaskExecutor() for managed_node1/TASK: Cleanup [0affffe7-6841-f581-0619-000000000009] 8253 1726773021.85409: sending task result for task 0affffe7-6841-f581-0619-000000000009 8253 1726773021.85437: done sending task result for task 0affffe7-6841-f581-0619-000000000009 8253 1726773021.85440: WORKER PROCESS EXITING 8208 1726773021.85726: no more pending results, returning what we have 8208 1726773021.85729: in VariableManager get_vars() 8208 1726773021.85765: Calling all_inventory to load vars for managed_node1 8208 1726773021.85768: Calling groups_inventory to load vars for managed_node1 8208 1726773021.85770: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.85778: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.85781: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.85784: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.85979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.86158: done with get_vars() 8208 1726773021.86165: variable 'ansible_search_path' from source: unknown 8208 1726773021.86178: we have included files to process 8208 1726773021.86179: generating all_blocks data 8208 1726773021.86180: done generating all_blocks data 8208 1726773021.86188: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8208 1726773021.86189: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml 8208 1726773021.86192: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml for managed_node1 8208 1726773021.87144: done processing included file 8208 1726773021.87146: iterating over new_blocks loaded from include file 8208 1726773021.87147: in VariableManager get_vars() 8208 1726773021.87164: done with get_vars() 8208 1726773021.87165: filtering new block on tags 8208 1726773021.87179: done filtering new block on tags 8208 1726773021.87181: in VariableManager get_vars() 8208 1726773021.87195: done with get_vars() 8208 1726773021.87196: filtering new block on tags 8208 1726773021.87217: done filtering new block on tags 8208 1726773021.87219: done iterating over new_blocks loaded from include file 8208 1726773021.87220: extending task lists for all hosts with included blocks 8208 1726773021.88224: done extending task lists 8208 1726773021.88225: done processing included files 8208 1726773021.88226: results queue empty 8208 1726773021.88226: checking for any_errors_fatal 8208 1726773021.88228: done 
checking for any_errors_fatal 8208 1726773021.88229: checking for max_fail_percentage 8208 1726773021.88229: done checking for max_fail_percentage 8208 1726773021.88230: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.88230: done checking to see if all hosts have failed 8208 1726773021.88231: getting the remaining hosts for this loop 8208 1726773021.88231: done getting the remaining hosts for this loop 8208 1726773021.88233: getting the next task for host managed_node1 8208 1726773021.88236: done getting next task for host managed_node1 8208 1726773021.88237: ^ task is: TASK: Show current tuned profile settings 8208 1726773021.88239: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773021.88240: getting variables 8208 1726773021.88241: in VariableManager get_vars() 8208 1726773021.88250: Calling all_inventory to load vars for managed_node1 8208 1726773021.88251: Calling groups_inventory to load vars for managed_node1 8208 1726773021.88252: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.88259: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.88262: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.88265: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.88375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.88483: done with get_vars() 8208 1726773021.88492: done getting variables 8208 1726773021.88544: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current tuned profile settings] ************************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:2 Thursday 19 September 2024 15:10:21 -0400 (0:00:00.040) 0:00:02.559 **** 8208 1726773021.88567: entering _queue_task() for managed_node1/command 8208 1726773021.88568: Creating lock for command 8208 1726773021.88769: worker is 1 (out of 1 available) 8208 1726773021.88782: exiting _queue_task() for managed_node1/command 8208 1726773021.88795: done queuing things up, now waiting for results queue to drain 8208 1726773021.88796: waiting for pending results... 
8255 1726773021.88893: running TaskExecutor() for managed_node1/TASK: Show current tuned profile settings 8255 1726773021.88994: in run() - task 0affffe7-6841-f581-0619-000000000095 8255 1726773021.89010: variable 'ansible_search_path' from source: unknown 8255 1726773021.89015: variable 'ansible_search_path' from source: unknown 8255 1726773021.89043: calling self._execute() 8255 1726773021.89100: variable 'ansible_host' from source: host vars for 'managed_node1' 8255 1726773021.89108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8255 1726773021.89117: variable 'omit' from source: magic vars 8255 1726773021.89194: variable 'omit' from source: magic vars 8255 1726773021.89224: variable 'omit' from source: magic vars 8255 1726773021.89451: variable '__kernel_settings_profile_filename' from source: role '' exported vars 8255 1726773021.89509: variable '__kernel_settings_profile_dir' from source: role '' exported vars 8255 1726773021.89572: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 8255 1726773021.89638: done running TaskExecutor() for managed_node1/TASK: Show current tuned profile settings [0affffe7-6841-f581-0619-000000000095] 8255 1726773021.89648: sending task result for task 0affffe7-6841-f581-0619-000000000095 8255 1726773021.89674: done sending task result for task 0affffe7-6841-f581-0619-000000000095 8255 1726773021.89678: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => {} MSG: The task includes an option with an undefined variable. The error was: {{ __kernel_settings_profile_dir }}/tuned.conf: {{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}: '__kernel_settings_profile_parent' is undefined. '__kernel_settings_profile_parent' is undefined. {{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}: '__kernel_settings_profile_parent' is undefined. '__kernel_settings_profile_parent' is undefined. {{ __kernel_settings_profile_dir }}/tuned.conf: {{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}: '__kernel_settings_profile_parent' is undefined. '__kernel_settings_profile_parent' is undefined. {{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}: '__kernel_settings_profile_parent' is undefined. '__kernel_settings_profile_parent' is undefined The error appears to be in '/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml': line 2, column 3, but may be elsewhere in the file depending on the exact syntax problem. 
The offending line appears to be: --- - name: Show current tuned profile settings ^ here ...ignoring 8208 1726773021.89788: no more pending results, returning what we have 8208 1726773021.89790: results queue empty 8208 1726773021.89791: checking for any_errors_fatal 8208 1726773021.89795: done checking for any_errors_fatal 8208 1726773021.89796: checking for max_fail_percentage 8208 1726773021.89797: done checking for max_fail_percentage 8208 1726773021.89798: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.89798: done checking to see if all hosts have failed 8208 1726773021.89799: getting the remaining hosts for this loop 8208 1726773021.89800: done getting the remaining hosts for this loop 8208 1726773021.89803: getting the next task for host managed_node1 8208 1726773021.89808: done getting next task for host managed_node1 8208 1726773021.89812: ^ task is: TASK: Run role with purge to remove everything 8208 1726773021.89815: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773021.89817: getting variables 8208 1726773021.89819: in VariableManager get_vars() 8208 1726773021.89848: Calling all_inventory to load vars for managed_node1 8208 1726773021.89852: Calling groups_inventory to load vars for managed_node1 8208 1726773021.89856: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.89865: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.89867: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.89869: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.89975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.90108: done with get_vars() 8208 1726773021.90116: done getting variables TASK [Run role with purge to remove everything] ******************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:9 Thursday 19 September 2024 15:10:21 -0400 (0:00:00.016) 0:00:02.576 **** 8208 1726773021.90191: entering _queue_task() for managed_node1/include_role 8208 1726773021.90386: worker is 1 (out of 1 available) 8208 1726773021.90399: exiting _queue_task() for managed_node1/include_role 8208 1726773021.90409: done queuing things up, now waiting for results queue to drain 8208 1726773021.90411: waiting for pending results... 
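Note: the ignored failure above ("Show current tuned profile settings", cleanup.yml:2) is a templating error: __kernel_settings_profile_filename expands through __kernel_settings_profile_dir to __kernel_settings_profile_parent, which is not defined because the role that sets it has not run at this point, and the task tolerates the error (hence the "...ignoring" marker). Below is a hedged sketch of the failing task and the variable chain, reconstructed from the error text rather than from the actual files.

# Hedged sketch; the command body is hypothetical, the variable chain is
# taken from the error message above.
- name: Show current tuned profile settings
  command: cat {{ __kernel_settings_profile_filename }}   # hypothetical body
  changed_when: false
  ignore_errors: true   # lets the cleanup continue, hence "...ignoring"
# Variable chain behind the error, roughly:
#   __kernel_settings_profile_filename: "{{ __kernel_settings_profile_dir }}/tuned.conf"
#   __kernel_settings_profile_dir:      "{{ __kernel_settings_profile_parent }}/{{ __kernel_settings_tuned_profile }}"
# __kernel_settings_profile_parent is only set once the role runs, so the
# chain cannot be rendered here and the task fails with an undefined variable.
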
8256 1726773021.90599: running TaskExecutor() for managed_node1/TASK: Run role with purge to remove everything 8256 1726773021.90735: in run() - task 0affffe7-6841-f581-0619-000000000097 8256 1726773021.90753: variable 'ansible_search_path' from source: unknown 8256 1726773021.90760: variable 'ansible_search_path' from source: unknown 8256 1726773021.90794: calling self._execute() 8256 1726773021.90863: variable 'ansible_host' from source: host vars for 'managed_node1' 8256 1726773021.90872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8256 1726773021.90881: variable 'omit' from source: magic vars 8256 1726773021.90976: _execute() done 8256 1726773021.90982: dumping result to json 8256 1726773021.90988: done dumping result, returning 8256 1726773021.90993: done running TaskExecutor() for managed_node1/TASK: Run role with purge to remove everything [0affffe7-6841-f581-0619-000000000097] 8256 1726773021.91001: sending task result for task 0affffe7-6841-f581-0619-000000000097 8256 1726773021.91035: done sending task result for task 0affffe7-6841-f581-0619-000000000097 8256 1726773021.91039: WORKER PROCESS EXITING 8208 1726773021.91273: no more pending results, returning what we have 8208 1726773021.91276: in VariableManager get_vars() 8208 1726773021.91313: Calling all_inventory to load vars for managed_node1 8208 1726773021.91315: Calling groups_inventory to load vars for managed_node1 8208 1726773021.91316: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.91323: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.91324: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.91326: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.91596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.91699: done with get_vars() 8208 1726773021.91703: variable 'ansible_search_path' from source: unknown 8208 1726773021.91704: variable 'ansible_search_path' from source: unknown 8208 1726773021.91883: variable 'omit' from source: magic vars 8208 1726773021.91908: variable 'omit' from source: magic vars 8208 1726773021.91918: variable 'omit' from source: magic vars 8208 1726773021.91920: we have included files to process 8208 1726773021.91921: generating all_blocks data 8208 1726773021.91921: done generating all_blocks data 8208 1726773021.91923: processing included file: fedora.linux_system_roles.kernel_settings 8208 1726773021.91938: in VariableManager get_vars() 8208 1726773021.91948: done with get_vars() 8208 1726773021.91968: in VariableManager get_vars() 8208 1726773021.91979: done with get_vars() 8208 1726773021.92008: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/main.yml 8208 1726773021.92043: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/defaults/main.yml 8208 1726773021.92060: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/meta/main.yml 8208 1726773021.92106: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml 8208 1726773021.92428: in VariableManager get_vars() 8208 1726773021.92442: done with get_vars() 8208 1726773021.93271: in VariableManager get_vars() 8208 1726773021.93289: done with get_vars() 8208 1726773021.93440: Loading data from 
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/handlers/main.yml 8208 1726773021.93849: iterating over new_blocks loaded from include file 8208 1726773021.93850: in VariableManager get_vars() 8208 1726773021.93865: done with get_vars() 8208 1726773021.93866: filtering new block on tags 8208 1726773021.93878: done filtering new block on tags 8208 1726773021.93880: in VariableManager get_vars() 8208 1726773021.93907: done with get_vars() 8208 1726773021.93908: filtering new block on tags 8208 1726773021.93921: done filtering new block on tags 8208 1726773021.93922: in VariableManager get_vars() 8208 1726773021.93933: done with get_vars() 8208 1726773021.93934: filtering new block on tags 8208 1726773021.93962: done filtering new block on tags 8208 1726773021.93964: in VariableManager get_vars() 8208 1726773021.93974: done with get_vars() 8208 1726773021.93975: filtering new block on tags 8208 1726773021.93984: done filtering new block on tags 8208 1726773021.93987: done iterating over new_blocks loaded from include file 8208 1726773021.93988: extending task lists for all hosts with included blocks 8208 1726773021.94157: done extending task lists 8208 1726773021.94158: done processing included files 8208 1726773021.94158: results queue empty 8208 1726773021.94158: checking for any_errors_fatal 8208 1726773021.94160: done checking for any_errors_fatal 8208 1726773021.94161: checking for max_fail_percentage 8208 1726773021.94161: done checking for max_fail_percentage 8208 1726773021.94162: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.94162: done checking to see if all hosts have failed 8208 1726773021.94163: getting the remaining hosts for this loop 8208 1726773021.94163: done getting the remaining hosts for this loop 8208 1726773021.94165: getting the next task for host managed_node1 8208 1726773021.94168: done getting next task for host managed_node1 8208 1726773021.94169: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8208 1726773021.94171: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773021.94177: getting variables 8208 1726773021.94178: in VariableManager get_vars() 8208 1726773021.94188: Calling all_inventory to load vars for managed_node1 8208 1726773021.94190: Calling groups_inventory to load vars for managed_node1 8208 1726773021.94191: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.94195: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.94196: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.94197: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.94279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.94393: done with get_vars() 8208 1726773021.94400: done getting variables 8208 1726773021.94424: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:2 Thursday 19 September 2024 15:10:21 -0400 (0:00:00.042) 0:00:02.618 **** 8208 1726773021.94451: entering _queue_task() for managed_node1/fail 8208 1726773021.94649: worker is 1 (out of 1 available) 8208 1726773021.94664: exiting _queue_task() for managed_node1/fail 8208 1726773021.94674: done queuing things up, now waiting for results queue to drain 8208 1726773021.94677: waiting for pending results... 8259 1726773021.94776: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values 8259 1726773021.94908: in run() - task 0affffe7-6841-f581-0619-00000000013d 8259 1726773021.94926: variable 'ansible_search_path' from source: unknown 8259 1726773021.94930: variable 'ansible_search_path' from source: unknown 8259 1726773021.94959: calling self._execute() 8259 1726773021.95056: variable 'ansible_host' from source: host vars for 'managed_node1' 8259 1726773021.95066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8259 1726773021.95074: variable 'omit' from source: magic vars 8259 1726773021.95479: variable 'kernel_settings_sysctl' from source: include params 8259 1726773021.95492: variable '__kernel_settings_state_empty' from source: role '' all vars 8259 1726773021.95504: Evaluated conditional (kernel_settings_sysctl != __kernel_settings_state_empty): True 8259 1726773021.95817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8259 1726773021.97676: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8259 1726773021.97738: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8259 1726773021.97768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8259 1726773021.97798: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8259 1726773021.97820: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8259 1726773021.97877: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8259 1726773021.97900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8259 1726773021.97919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8259 1726773021.97947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8259 1726773021.97960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8259 1726773021.97999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8259 1726773021.98016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8259 1726773021.98035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8259 1726773021.98061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8259 1726773021.98072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8259 1726773021.98102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8259 1726773021.98119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8259 1726773021.98135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8259 1726773021.98162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8259 1726773021.98174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8259 1726773021.98356: variable 'kernel_settings_sysctl' from source: include params 8259 1726773021.98380: Evaluated conditional ((kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", true) | list | 
length > 0) or (kernel_settings_sysctl | selectattr("value", "defined") | selectattr("value", "sameas", false) | list | length > 0)): False 8259 1726773021.98387: when evaluation is False, skipping this task 8259 1726773021.98391: _execute() done 8259 1726773021.98395: dumping result to json 8259 1726773021.98399: done dumping result, returning 8259 1726773021.98405: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check sysctl settings for boolean values [0affffe7-6841-f581-0619-00000000013d] 8259 1726773021.98411: sending task result for task 0affffe7-6841-f581-0619-00000000013d 8259 1726773021.98434: done sending task result for task 0affffe7-6841-f581-0619-00000000013d 8259 1726773021.98437: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "(kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", true) | list | length > 0) or (kernel_settings_sysctl | selectattr(\"value\", \"defined\") | selectattr(\"value\", \"sameas\", false) | list | length > 0)", "skip_reason": "Conditional result was False" } 8208 1726773021.98602: no more pending results, returning what we have 8208 1726773021.98605: results queue empty 8208 1726773021.98606: checking for any_errors_fatal 8208 1726773021.98608: done checking for any_errors_fatal 8208 1726773021.98608: checking for max_fail_percentage 8208 1726773021.98610: done checking for max_fail_percentage 8208 1726773021.98610: checking to see if all hosts have failed and the running result is not ok 8208 1726773021.98611: done checking to see if all hosts have failed 8208 1726773021.98611: getting the remaining hosts for this loop 8208 1726773021.98613: done getting the remaining hosts for this loop 8208 1726773021.98617: getting the next task for host managed_node1 8208 1726773021.98622: done getting next task for host managed_node1 8208 1726773021.98626: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8208 1726773021.98629: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773021.98646: getting variables 8208 1726773021.98647: in VariableManager get_vars() 8208 1726773021.98675: Calling all_inventory to load vars for managed_node1 8208 1726773021.98677: Calling groups_inventory to load vars for managed_node1 8208 1726773021.98679: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.98688: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.98690: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.98691: Calling groups_plugins_play to load vars for managed_node1 8208 1726773021.98792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773021.98907: done with get_vars() 8208 1726773021.98915: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Set version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:9 Thursday 19 September 2024 15:10:21 -0400 (0:00:00.045) 0:00:02.664 **** 8208 1726773021.98980: entering _queue_task() for managed_node1/include_tasks 8208 1726773021.99149: worker is 1 (out of 1 available) 8208 1726773021.99166: exiting _queue_task() for managed_node1/include_tasks 8208 1726773021.99176: done queuing things up, now waiting for results queue to drain 8208 1726773021.99178: waiting for pending results... 8262 1726773021.99301: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables 8262 1726773021.99425: in run() - task 0affffe7-6841-f581-0619-00000000013e 8262 1726773021.99446: variable 'ansible_search_path' from source: unknown 8262 1726773021.99451: variable 'ansible_search_path' from source: unknown 8262 1726773021.99480: calling self._execute() 8262 1726773021.99537: variable 'ansible_host' from source: host vars for 'managed_node1' 8262 1726773021.99546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8262 1726773021.99555: variable 'omit' from source: magic vars 8262 1726773021.99626: _execute() done 8262 1726773021.99632: dumping result to json 8262 1726773021.99637: done dumping result, returning 8262 1726773021.99642: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set version specific variables [0affffe7-6841-f581-0619-00000000013e] 8262 1726773021.99651: sending task result for task 0affffe7-6841-f581-0619-00000000013e 8262 1726773021.99675: done sending task result for task 0affffe7-6841-f581-0619-00000000013e 8262 1726773021.99678: WORKER PROCESS EXITING 8208 1726773021.99801: no more pending results, returning what we have 8208 1726773021.99805: in VariableManager get_vars() 8208 1726773021.99840: Calling all_inventory to load vars for managed_node1 8208 1726773021.99843: Calling groups_inventory to load vars for managed_node1 8208 1726773021.99845: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773021.99852: Calling all_plugins_play to load vars for managed_node1 8208 1726773021.99858: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773021.99861: Calling groups_plugins_play to load vars for managed_node1 8208 1726773022.00022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773022.00214: done with get_vars() 8208 1726773022.00226: variable 'ansible_search_path' from source: unknown 8208 1726773022.00227: variable 
'ansible_search_path' from source: unknown 8208 1726773022.00263: we have included files to process 8208 1726773022.00264: generating all_blocks data 8208 1726773022.00266: done generating all_blocks data 8208 1726773022.00270: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8208 1726773022.00272: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml 8208 1726773022.00274: Loading data from /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml for managed_node1 8208 1726773022.01105: done processing included file 8208 1726773022.01107: iterating over new_blocks loaded from include file 8208 1726773022.01108: in VariableManager get_vars() 8208 1726773022.01132: done with get_vars() 8208 1726773022.01133: filtering new block on tags 8208 1726773022.01148: done filtering new block on tags 8208 1726773022.01151: in VariableManager get_vars() 8208 1726773022.01177: done with get_vars() 8208 1726773022.01179: filtering new block on tags 8208 1726773022.01202: done filtering new block on tags 8208 1726773022.01205: in VariableManager get_vars() 8208 1726773022.01227: done with get_vars() 8208 1726773022.01230: filtering new block on tags 8208 1726773022.01248: done filtering new block on tags 8208 1726773022.01250: in VariableManager get_vars() 8208 1726773022.01271: done with get_vars() 8208 1726773022.01272: filtering new block on tags 8208 1726773022.01287: done filtering new block on tags 8208 1726773022.01289: done iterating over new_blocks loaded from include file 8208 1726773022.01290: extending task lists for all hosts with included blocks 8208 1726773022.01554: done extending task lists 8208 1726773022.01556: done processing included files 8208 1726773022.01556: results queue empty 8208 1726773022.01557: checking for any_errors_fatal 8208 1726773022.01560: done checking for any_errors_fatal 8208 1726773022.01561: checking for max_fail_percentage 8208 1726773022.01562: done checking for max_fail_percentage 8208 1726773022.01562: checking to see if all hosts have failed and the running result is not ok 8208 1726773022.01563: done checking to see if all hosts have failed 8208 1726773022.01563: getting the remaining hosts for this loop 8208 1726773022.01564: done getting the remaining hosts for this loop 8208 1726773022.01567: getting the next task for host managed_node1 8208 1726773022.01571: done getting next task for host managed_node1 8208 1726773022.01573: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8208 1726773022.01576: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773022.01586: getting variables 8208 1726773022.01587: in VariableManager get_vars() 8208 1726773022.01599: Calling all_inventory to load vars for managed_node1 8208 1726773022.01601: Calling groups_inventory to load vars for managed_node1 8208 1726773022.01603: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773022.01607: Calling all_plugins_play to load vars for managed_node1 8208 1726773022.01609: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773022.01611: Calling groups_plugins_play to load vars for managed_node1 8208 1726773022.01744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773022.01939: done with get_vars() 8208 1726773022.01947: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:2 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.030) 0:00:02.694 **** 8208 1726773022.02013: entering _queue_task() for managed_node1/setup 8208 1726773022.02215: worker is 1 (out of 1 available) 8208 1726773022.02228: exiting _queue_task() for managed_node1/setup 8208 1726773022.02239: done queuing things up, now waiting for results queue to drain 8208 1726773022.02242: waiting for pending results... 
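
For reference, the "Check sysctl settings for boolean values" task skipped above can be pieced together from the two conditionals the log records: the gate kernel_settings_sysctl != __kernel_settings_state_empty (evaluated True) and the selectattr/sameas expression (evaluated False, which causes the skip). The YAML below is a reconstruction from those log lines only, not the role's actual tasks/main.yml, and the fail message wording is a placeholder:

# Approximate reconstruction based only on the conditionals logged above;
# the msg text is invented for illustration, not taken from the role source.
- name: Check sysctl settings for boolean values
  fail:
    msg: sysctl settings may not use boolean values   # placeholder wording
  when:
    - kernel_settings_sysctl != __kernel_settings_state_empty
    - >-
      (kernel_settings_sysctl | selectattr("value", "defined") |
       selectattr("value", "sameas", true) | list | length > 0) or
      (kernel_settings_sysctl | selectattr("value", "defined") |
       selectattr("value", "sameas", false) | list | length > 0)
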
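Similarly, the "Ensure ansible_facts used by role" task queued here amounts to conditional fact gathering: it only runs when some fact named in __kernel_settings_required_facts is missing from ansible_facts, which is exactly the false_condition reported a few lines below. A minimal sketch, assuming a plain setup call; the gather_subset argument is an assumption, not taken from the role source:

# Minimal sketch -- the when expression matches the logged false_condition;
# the setup arguments are assumed, the real role may request other subsets.
- name: Ensure ansible_facts used by role
  setup:
    gather_subset: min   # assumption for illustration
  when: __kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0
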
8263 1726773022.02438: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role 8263 1726773022.02606: in run() - task 0affffe7-6841-f581-0619-0000000001b9 8263 1726773022.02626: variable 'ansible_search_path' from source: unknown 8263 1726773022.02631: variable 'ansible_search_path' from source: unknown 8263 1726773022.02663: calling self._execute() 8263 1726773022.02732: variable 'ansible_host' from source: host vars for 'managed_node1' 8263 1726773022.02742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8263 1726773022.02751: variable 'omit' from source: magic vars 8263 1726773022.03219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8263 1726773022.04739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8263 1726773022.04792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8263 1726773022.04821: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8263 1726773022.04856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8263 1726773022.04880: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8263 1726773022.04954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8263 1726773022.04982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8263 1726773022.05007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8263 1726773022.05044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8263 1726773022.05059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8263 1726773022.05111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8263 1726773022.05134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8263 1726773022.05157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8263 1726773022.05186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8263 1726773022.05198: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8263 1726773022.05319: variable '__kernel_settings_required_facts' from source: role '' all vars 8263 1726773022.05331: variable 'ansible_facts' from source: unknown 8263 1726773022.05393: Evaluated conditional (__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 8263 1726773022.05399: when evaluation is False, skipping this task 8263 1726773022.05403: _execute() done 8263 1726773022.05407: dumping result to json 8263 1726773022.05410: done dumping result, returning 8263 1726773022.05417: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure ansible_facts used by role [0affffe7-6841-f581-0619-0000000001b9] 8263 1726773022.05424: sending task result for task 0affffe7-6841-f581-0619-0000000001b9 8263 1726773022.05446: done sending task result for task 0affffe7-6841-f581-0619-0000000001b9 8263 1726773022.05449: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } 8208 1726773022.05558: no more pending results, returning what we have 8208 1726773022.05561: results queue empty 8208 1726773022.05562: checking for any_errors_fatal 8208 1726773022.05563: done checking for any_errors_fatal 8208 1726773022.05564: checking for max_fail_percentage 8208 1726773022.05565: done checking for max_fail_percentage 8208 1726773022.05566: checking to see if all hosts have failed and the running result is not ok 8208 1726773022.05566: done checking to see if all hosts have failed 8208 1726773022.05567: getting the remaining hosts for this loop 8208 1726773022.05568: done getting the remaining hosts for this loop 8208 1726773022.05571: getting the next task for host managed_node1 8208 1726773022.05579: done getting next task for host managed_node1 8208 1726773022.05583: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8208 1726773022.05589: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773022.05603: getting variables 8208 1726773022.05604: in VariableManager get_vars() 8208 1726773022.05637: Calling all_inventory to load vars for managed_node1 8208 1726773022.05640: Calling groups_inventory to load vars for managed_node1 8208 1726773022.05641: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773022.05649: Calling all_plugins_play to load vars for managed_node1 8208 1726773022.05651: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773022.05654: Calling groups_plugins_play to load vars for managed_node1 8208 1726773022.05823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773022.06065: done with get_vars() 8208 1726773022.06076: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.041) 0:00:02.736 **** 8208 1726773022.06186: entering _queue_task() for managed_node1/stat 8208 1726773022.06382: worker is 1 (out of 1 available) 8208 1726773022.06402: exiting _queue_task() for managed_node1/stat 8208 1726773022.06413: done queuing things up, now waiting for results queue to drain 8208 1726773022.06415: waiting for pending results... 8266 1726773022.06523: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree 8266 1726773022.06647: in run() - task 0affffe7-6841-f581-0619-0000000001bb 8266 1726773022.06664: variable 'ansible_search_path' from source: unknown 8266 1726773022.06669: variable 'ansible_search_path' from source: unknown 8266 1726773022.06699: calling self._execute() 8266 1726773022.06756: variable 'ansible_host' from source: host vars for 'managed_node1' 8266 1726773022.06765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8266 1726773022.06774: variable 'omit' from source: magic vars 8266 1726773022.07099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8266 1726773022.07274: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8266 1726773022.07311: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8266 1726773022.07336: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8266 1726773022.07364: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8266 1726773022.07425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8266 1726773022.07445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8266 1726773022.07465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8266 1726773022.07488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 8266 1726773022.07580: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8266 1726773022.07590: variable 'omit' from source: magic vars 8266 1726773022.07635: variable 'omit' from source: magic vars 8266 1726773022.07659: variable 'omit' from source: magic vars 8266 1726773022.07680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8266 1726773022.07704: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8266 1726773022.07720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8266 1726773022.07735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8266 1726773022.07745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8266 1726773022.07768: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8266 1726773022.07774: variable 'ansible_host' from source: host vars for 'managed_node1' 8266 1726773022.07778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8266 1726773022.07845: Set connection var ansible_shell_executable to /bin/sh 8266 1726773022.07851: Set connection var ansible_connection to ssh 8266 1726773022.07857: Set connection var ansible_module_compression to ZIP_DEFLATED 8266 1726773022.07865: Set connection var ansible_timeout to 10 8266 1726773022.07868: Set connection var ansible_shell_type to sh 8266 1726773022.07875: Set connection var ansible_pipelining to False 8266 1726773022.07893: variable 'ansible_shell_executable' from source: unknown 8266 1726773022.07897: variable 'ansible_connection' from source: unknown 8266 1726773022.07901: variable 'ansible_module_compression' from source: unknown 8266 1726773022.07904: variable 'ansible_shell_type' from source: unknown 8266 1726773022.07907: variable 'ansible_shell_executable' from source: unknown 8266 1726773022.07910: variable 'ansible_host' from source: host vars for 'managed_node1' 8266 1726773022.07914: variable 'ansible_pipelining' from source: unknown 8266 1726773022.07917: variable 'ansible_timeout' from source: unknown 8266 1726773022.07921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8266 1726773022.08013: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8266 1726773022.08024: variable 'omit' from source: magic vars 8266 1726773022.08030: starting attempt loop 8266 1726773022.08033: running the handler 8266 1726773022.08044: _low_level_execute_command(): starting 8266 1726773022.08052: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8266 1726773022.10514: stdout chunk (state=2): >>>/root <<< 8266 1726773022.10630: stderr chunk (state=3): >>><<< 8266 1726773022.10638: stdout chunk (state=3): >>><<< 8266 1726773022.10658: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8266 1726773022.10672: _low_level_execute_command(): starting 8266 1726773022.10678: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856 `" 
&& echo ansible-tmp-1726773022.1066701-8266-106774012352856="` echo /root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856 `" ) && sleep 0' 8266 1726773022.13262: stdout chunk (state=2): >>>ansible-tmp-1726773022.1066701-8266-106774012352856=/root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856 <<< 8266 1726773022.13390: stderr chunk (state=3): >>><<< 8266 1726773022.13398: stdout chunk (state=3): >>><<< 8266 1726773022.13416: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773022.1066701-8266-106774012352856=/root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856 , stderr= 8266 1726773022.13453: variable 'ansible_module_compression' from source: unknown 8266 1726773022.13504: ANSIBALLZ: Using lock for stat 8266 1726773022.13509: ANSIBALLZ: Acquiring lock 8266 1726773022.13512: ANSIBALLZ: Lock acquired: 139627422453680 8266 1726773022.13516: ANSIBALLZ: Creating module 8266 1726773022.21852: ANSIBALLZ: Writing module into payload 8266 1726773022.21937: ANSIBALLZ: Writing module 8266 1726773022.21958: ANSIBALLZ: Renaming module 8266 1726773022.21965: ANSIBALLZ: Done creating module 8266 1726773022.21980: variable 'ansible_facts' from source: unknown 8266 1726773022.22038: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856/AnsiballZ_stat.py 8266 1726773022.22148: Sending initial data 8266 1726773022.22159: Sent initial data (151 bytes) 8266 1726773022.24913: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp5ki7kx24 /root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856/AnsiballZ_stat.py <<< 8266 1726773022.26334: stderr chunk (state=3): >>><<< 8266 1726773022.26344: stdout chunk (state=3): >>><<< 8266 1726773022.26366: done transferring module to remote 8266 1726773022.26379: _low_level_execute_command(): starting 8266 1726773022.26387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856/ /root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856/AnsiballZ_stat.py && sleep 0' 8266 1726773022.28864: stderr chunk (state=2): >>><<< 8266 1726773022.28874: stdout chunk (state=2): >>><<< 8266 1726773022.28890: _low_level_execute_command() done: rc=0, stdout=, stderr= 8266 1726773022.28895: _low_level_execute_command(): starting 8266 1726773022.28900: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856/AnsiballZ_stat.py && sleep 0' 8266 1726773022.43680: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8266 1726773022.44665: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8266 1726773022.44711: stderr chunk (state=3): >>><<< 8266 1726773022.44718: stdout chunk (state=3): >>><<< 8266 1726773022.44735: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.43.7 closed. 
8266 1726773022.44758: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8266 1726773022.44770: _low_level_execute_command(): starting 8266 1726773022.44776: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773022.1066701-8266-106774012352856/ > /dev/null 2>&1 && sleep 0' 8266 1726773022.47311: stderr chunk (state=2): >>><<< 8266 1726773022.47322: stdout chunk (state=2): >>><<< 8266 1726773022.47338: _low_level_execute_command() done: rc=0, stdout=, stderr= 8266 1726773022.47345: handler run complete 8266 1726773022.47361: attempt loop complete, returning result 8266 1726773022.47365: _execute() done 8266 1726773022.47369: dumping result to json 8266 1726773022.47373: done dumping result, returning 8266 1726773022.47381: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check if system is ostree [0affffe7-6841-f581-0619-0000000001bb] 8266 1726773022.47387: sending task result for task 0affffe7-6841-f581-0619-0000000001bb 8266 1726773022.47414: done sending task result for task 0affffe7-6841-f581-0619-0000000001bb 8266 1726773022.47417: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 8208 1726773022.47552: no more pending results, returning what we have 8208 1726773022.47557: results queue empty 8208 1726773022.47558: checking for any_errors_fatal 8208 1726773022.47564: done checking for any_errors_fatal 8208 1726773022.47565: checking for max_fail_percentage 8208 1726773022.47566: done checking for max_fail_percentage 8208 1726773022.47567: checking to see if all hosts have failed and the running result is not ok 8208 1726773022.47568: done checking to see if all hosts have failed 8208 1726773022.47568: getting the remaining hosts for this loop 8208 1726773022.47569: done getting the remaining hosts for this loop 8208 1726773022.47572: getting the next task for host managed_node1 8208 1726773022.47578: done getting next task for host managed_node1 8208 1726773022.47581: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8208 1726773022.47587: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773022.47597: getting variables 8208 1726773022.47599: in VariableManager get_vars() 8208 1726773022.47631: Calling all_inventory to load vars for managed_node1 8208 1726773022.47634: Calling groups_inventory to load vars for managed_node1 8208 1726773022.47635: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773022.47643: Calling all_plugins_play to load vars for managed_node1 8208 1726773022.47645: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773022.47647: Calling groups_plugins_play to load vars for managed_node1 8208 1726773022.47771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773022.47892: done with get_vars() 8208 1726773022.47900: done getting variables 8208 1726773022.47972: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:15 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.418) 0:00:03.154 **** 8208 1726773022.48000: entering _queue_task() for managed_node1/set_fact 8208 1726773022.48001: Creating lock for set_fact 8208 1726773022.48175: worker is 1 (out of 1 available) 8208 1726773022.48191: exiting _queue_task() for managed_node1/set_fact 8208 1726773022.48202: done queuing things up, now waiting for results queue to drain 8208 1726773022.48204: waiting for pending results... 
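
The "Set flag to indicate system is ostree" task queued here turns that registered stat into a fact; the result below shows __kernel_settings_is_ostree ending up false. A plausible sketch, with the exact Jinja2 expression assumed (the log only shows the inputs and the final value):

# Sketch only: the use of __ostree_booted_stat and the resulting value (false)
# are logged; the exact expression is an assumption.
- name: Set flag to indicate system is ostree
  set_fact:
    __kernel_settings_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __kernel_settings_is_ostree is defined
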
8274 1726773022.48304: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree 8274 1726773022.48426: in run() - task 0affffe7-6841-f581-0619-0000000001bc 8274 1726773022.48441: variable 'ansible_search_path' from source: unknown 8274 1726773022.48445: variable 'ansible_search_path' from source: unknown 8274 1726773022.48472: calling self._execute() 8274 1726773022.48528: variable 'ansible_host' from source: host vars for 'managed_node1' 8274 1726773022.48537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8274 1726773022.48546: variable 'omit' from source: magic vars 8274 1726773022.48866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8274 1726773022.49084: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8274 1726773022.49117: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8274 1726773022.49142: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8274 1726773022.49170: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8274 1726773022.49231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8274 1726773022.49250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8274 1726773022.49271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8274 1726773022.49293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8274 1726773022.49376: Evaluated conditional (not __kernel_settings_is_ostree is defined): True 8274 1726773022.49387: variable 'omit' from source: magic vars 8274 1726773022.49430: variable 'omit' from source: magic vars 8274 1726773022.49509: variable '__ostree_booted_stat' from source: set_fact 8274 1726773022.49546: variable 'omit' from source: magic vars 8274 1726773022.49568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8274 1726773022.49591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8274 1726773022.49606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8274 1726773022.49621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8274 1726773022.49630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8274 1726773022.49652: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8274 1726773022.49657: variable 'ansible_host' from source: host vars for 'managed_node1' 8274 1726773022.49661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8274 1726773022.49727: Set connection var ansible_shell_executable 
to /bin/sh 8274 1726773022.49733: Set connection var ansible_connection to ssh 8274 1726773022.49739: Set connection var ansible_module_compression to ZIP_DEFLATED 8274 1726773022.49746: Set connection var ansible_timeout to 10 8274 1726773022.49749: Set connection var ansible_shell_type to sh 8274 1726773022.49756: Set connection var ansible_pipelining to False 8274 1726773022.49772: variable 'ansible_shell_executable' from source: unknown 8274 1726773022.49776: variable 'ansible_connection' from source: unknown 8274 1726773022.49780: variable 'ansible_module_compression' from source: unknown 8274 1726773022.49783: variable 'ansible_shell_type' from source: unknown 8274 1726773022.49788: variable 'ansible_shell_executable' from source: unknown 8274 1726773022.49791: variable 'ansible_host' from source: host vars for 'managed_node1' 8274 1726773022.49795: variable 'ansible_pipelining' from source: unknown 8274 1726773022.49799: variable 'ansible_timeout' from source: unknown 8274 1726773022.49803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8274 1726773022.49863: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8274 1726773022.49873: variable 'omit' from source: magic vars 8274 1726773022.49876: starting attempt loop 8274 1726773022.49878: running the handler 8274 1726773022.49886: handler run complete 8274 1726773022.49892: attempt loop complete, returning result 8274 1726773022.49894: _execute() done 8274 1726773022.49896: dumping result to json 8274 1726773022.49898: done dumping result, returning 8274 1726773022.49901: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate system is ostree [0affffe7-6841-f581-0619-0000000001bc] 8274 1726773022.49905: sending task result for task 0affffe7-6841-f581-0619-0000000001bc 8274 1726773022.49922: done sending task result for task 0affffe7-6841-f581-0619-0000000001bc 8274 1726773022.49924: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_is_ostree": false }, "changed": false } 8208 1726773022.50198: no more pending results, returning what we have 8208 1726773022.50200: results queue empty 8208 1726773022.50201: checking for any_errors_fatal 8208 1726773022.50204: done checking for any_errors_fatal 8208 1726773022.50204: checking for max_fail_percentage 8208 1726773022.50205: done checking for max_fail_percentage 8208 1726773022.50206: checking to see if all hosts have failed and the running result is not ok 8208 1726773022.50206: done checking to see if all hosts have failed 8208 1726773022.50206: getting the remaining hosts for this loop 8208 1726773022.50207: done getting the remaining hosts for this loop 8208 1726773022.50209: getting the next task for host managed_node1 8208 1726773022.50216: done getting next task for host managed_node1 8208 1726773022.50219: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8208 1726773022.50222: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773022.50232: getting variables 8208 1726773022.50233: in VariableManager get_vars() 8208 1726773022.50261: Calling all_inventory to load vars for managed_node1 8208 1726773022.50262: Calling groups_inventory to load vars for managed_node1 8208 1726773022.50264: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773022.50269: Calling all_plugins_play to load vars for managed_node1 8208 1726773022.50271: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773022.50272: Calling groups_plugins_play to load vars for managed_node1 8208 1726773022.50404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773022.50520: done with get_vars() 8208 1726773022.50527: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:22 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.025) 0:00:03.180 **** 8208 1726773022.50595: entering _queue_task() for managed_node1/stat 8208 1726773022.50751: worker is 1 (out of 1 available) 8208 1726773022.50767: exiting _queue_task() for managed_node1/stat 8208 1726773022.50777: done queuing things up, now waiting for results queue to drain 8208 1726773022.50779: waiting for pending results... 
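
The "Check if transactional-update exists in /sbin" task queued here follows the same pattern as the ostree probe: a stat on /sbin/transactional-update (the path appears in the module invocation further down), gated on __kernel_settings_is_transactional not yet being defined. A minimal sketch; the register name is a hypothetical placeholder since the log never prints it:

# Minimal sketch; path and when-condition are logged, the register name
# (__transactional_update_stat) is a hypothetical placeholder.
- name: Check if transactional-update exists in /sbin
  stat:
    path: /sbin/transactional-update
  register: __transactional_update_stat
  when: not __kernel_settings_is_transactional is defined
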
8275 1726773022.50877: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin 8275 1726773022.50994: in run() - task 0affffe7-6841-f581-0619-0000000001be 8275 1726773022.51011: variable 'ansible_search_path' from source: unknown 8275 1726773022.51015: variable 'ansible_search_path' from source: unknown 8275 1726773022.51042: calling self._execute() 8275 1726773022.51096: variable 'ansible_host' from source: host vars for 'managed_node1' 8275 1726773022.51104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8275 1726773022.51113: variable 'omit' from source: magic vars 8275 1726773022.51427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8275 1726773022.51597: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8275 1726773022.51629: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8275 1726773022.51655: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8275 1726773022.51683: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8275 1726773022.51741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8275 1726773022.51763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8275 1726773022.51783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8275 1726773022.51803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8275 1726773022.51889: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 8275 1726773022.51899: variable 'omit' from source: magic vars 8275 1726773022.51941: variable 'omit' from source: magic vars 8275 1726773022.51963: variable 'omit' from source: magic vars 8275 1726773022.51984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8275 1726773022.52007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8275 1726773022.52022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8275 1726773022.52035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8275 1726773022.52045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8275 1726773022.52067: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8275 1726773022.52073: variable 'ansible_host' from source: host vars for 'managed_node1' 8275 1726773022.52077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8275 1726773022.52141: Set connection var ansible_shell_executable to /bin/sh 8275 1726773022.52146: Set connection var 
ansible_connection to ssh 8275 1726773022.52153: Set connection var ansible_module_compression to ZIP_DEFLATED 8275 1726773022.52161: Set connection var ansible_timeout to 10 8275 1726773022.52164: Set connection var ansible_shell_type to sh 8275 1726773022.52171: Set connection var ansible_pipelining to False 8275 1726773022.52189: variable 'ansible_shell_executable' from source: unknown 8275 1726773022.52193: variable 'ansible_connection' from source: unknown 8275 1726773022.52196: variable 'ansible_module_compression' from source: unknown 8275 1726773022.52199: variable 'ansible_shell_type' from source: unknown 8275 1726773022.52202: variable 'ansible_shell_executable' from source: unknown 8275 1726773022.52205: variable 'ansible_host' from source: host vars for 'managed_node1' 8275 1726773022.52210: variable 'ansible_pipelining' from source: unknown 8275 1726773022.52212: variable 'ansible_timeout' from source: unknown 8275 1726773022.52214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8275 1726773022.52306: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8275 1726773022.52317: variable 'omit' from source: magic vars 8275 1726773022.52323: starting attempt loop 8275 1726773022.52327: running the handler 8275 1726773022.52338: _low_level_execute_command(): starting 8275 1726773022.52345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8275 1726773022.54761: stdout chunk (state=2): >>>/root <<< 8275 1726773022.54874: stderr chunk (state=3): >>><<< 8275 1726773022.54882: stdout chunk (state=3): >>><<< 8275 1726773022.54902: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8275 1726773022.54916: _low_level_execute_command(): starting 8275 1726773022.54922: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334 `" && echo ansible-tmp-1726773022.5491138-8275-201100370733334="` echo /root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334 `" ) && sleep 0' 8275 1726773022.57433: stdout chunk (state=2): >>>ansible-tmp-1726773022.5491138-8275-201100370733334=/root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334 <<< 8275 1726773022.57561: stderr chunk (state=3): >>><<< 8275 1726773022.57569: stdout chunk (state=3): >>><<< 8275 1726773022.57584: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773022.5491138-8275-201100370733334=/root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334 , stderr= 8275 1726773022.57622: variable 'ansible_module_compression' from source: unknown 8275 1726773022.57673: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8275 1726773022.57705: variable 'ansible_facts' from source: unknown 8275 1726773022.57774: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334/AnsiballZ_stat.py 8275 1726773022.57877: Sending initial data 8275 1726773022.57884: Sent initial data (151 bytes) 8275 1726773022.60528: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpmnfmh15a 
/root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334/AnsiballZ_stat.py <<< 8275 1726773022.61948: stderr chunk (state=3): >>><<< 8275 1726773022.61966: stdout chunk (state=3): >>><<< 8275 1726773022.61983: done transferring module to remote 8275 1726773022.61996: _low_level_execute_command(): starting 8275 1726773022.62001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334/ /root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334/AnsiballZ_stat.py && sleep 0' 8275 1726773022.64444: stderr chunk (state=2): >>><<< 8275 1726773022.64453: stdout chunk (state=2): >>><<< 8275 1726773022.64468: _low_level_execute_command() done: rc=0, stdout=, stderr= 8275 1726773022.64472: _low_level_execute_command(): starting 8275 1726773022.64478: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334/AnsiballZ_stat.py && sleep 0' 8275 1726773022.79228: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8275 1726773022.80224: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8275 1726773022.80273: stderr chunk (state=3): >>><<< 8275 1726773022.80281: stdout chunk (state=3): >>><<< 8275 1726773022.80298: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/sbin/transactional-update", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.43.7 closed. 
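The module result just shown is the ansible.builtin.stat call behind the role's "Check if transactional-update exists in /sbin" task. A minimal sketch of that task, reconstructed from the module arguments and the conditional printed in this log (the register name __transactional_update_stat is inferred from the later set_fact entries; the wording is an assumption, not the role's verbatim source):

    # sketch of tasks/set_vars.yml (assumed wording, based on this log)
    - name: Check if transactional-update exists in /sbin
      stat:
        path: /sbin/transactional-update
      register: __transactional_update_stat
      when: not __kernel_settings_is_transactional is defined

On this host the result is stat.exists == false, which the following set_fact turns into the __kernel_settings_is_transactional flag.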
8275 1726773022.80356: done with _execute_module (stat, {'path': '/sbin/transactional-update', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8275 1726773022.80369: _low_level_execute_command(): starting 8275 1726773022.80375: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773022.5491138-8275-201100370733334/ > /dev/null 2>&1 && sleep 0' 8275 1726773022.82913: stderr chunk (state=2): >>><<< 8275 1726773022.82924: stdout chunk (state=2): >>><<< 8275 1726773022.82940: _low_level_execute_command() done: rc=0, stdout=, stderr= 8275 1726773022.82950: handler run complete 8275 1726773022.82967: attempt loop complete, returning result 8275 1726773022.82971: _execute() done 8275 1726773022.82975: dumping result to json 8275 1726773022.82979: done dumping result, returning 8275 1726773022.82988: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check if transactional-update exists in /sbin [0affffe7-6841-f581-0619-0000000001be] 8275 1726773022.82994: sending task result for task 0affffe7-6841-f581-0619-0000000001be 8275 1726773022.83023: done sending task result for task 0affffe7-6841-f581-0619-0000000001be 8275 1726773022.83026: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 8208 1726773022.83165: no more pending results, returning what we have 8208 1726773022.83168: results queue empty 8208 1726773022.83169: checking for any_errors_fatal 8208 1726773022.83173: done checking for any_errors_fatal 8208 1726773022.83174: checking for max_fail_percentage 8208 1726773022.83175: done checking for max_fail_percentage 8208 1726773022.83175: checking to see if all hosts have failed and the running result is not ok 8208 1726773022.83176: done checking to see if all hosts have failed 8208 1726773022.83177: getting the remaining hosts for this loop 8208 1726773022.83178: done getting the remaining hosts for this loop 8208 1726773022.83181: getting the next task for host managed_node1 8208 1726773022.83189: done getting next task for host managed_node1 8208 1726773022.83192: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8208 1726773022.83196: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773022.83206: getting variables 8208 1726773022.83207: in VariableManager get_vars() 8208 1726773022.83239: Calling all_inventory to load vars for managed_node1 8208 1726773022.83241: Calling groups_inventory to load vars for managed_node1 8208 1726773022.83243: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773022.83251: Calling all_plugins_play to load vars for managed_node1 8208 1726773022.83253: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773022.83258: Calling groups_plugins_play to load vars for managed_node1 8208 1726773022.83368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773022.83489: done with get_vars() 8208 1726773022.83498: done getting variables 8208 1726773022.83540: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:27 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.329) 0:00:03.509 **** 8208 1726773022.83566: entering _queue_task() for managed_node1/set_fact 8208 1726773022.83740: worker is 1 (out of 1 available) 8208 1726773022.83758: exiting _queue_task() for managed_node1/set_fact 8208 1726773022.83770: done queuing things up, now waiting for results queue to drain 8208 1726773022.83772: waiting for pending results... 
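The task queued above, "Set flag if transactional-update exists" (tasks/set_vars.yml:27), converts the registered stat result into a boolean fact. A hedged sketch, inferred from the fact name, the __transactional_update_stat variable, and the conditional that appear in the surrounding entries (assumed wording, not copied from the role source):

    - name: Set flag if transactional-update exists
      set_fact:
        __kernel_settings_is_transactional: "{{ __transactional_update_stat.stat.exists }}"
      when: not __kernel_settings_is_transactional is defined

Its result below (__kernel_settings_is_transactional: false) matches the stat.exists value returned by the previous task.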
8283 1726773022.83875: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists 8283 1726773022.83997: in run() - task 0affffe7-6841-f581-0619-0000000001bf 8283 1726773022.84017: variable 'ansible_search_path' from source: unknown 8283 1726773022.84021: variable 'ansible_search_path' from source: unknown 8283 1726773022.84051: calling self._execute() 8283 1726773022.84107: variable 'ansible_host' from source: host vars for 'managed_node1' 8283 1726773022.84116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8283 1726773022.84125: variable 'omit' from source: magic vars 8283 1726773022.84449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8283 1726773022.84676: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8283 1726773022.84710: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8283 1726773022.84736: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8283 1726773022.84764: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8283 1726773022.84826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8283 1726773022.84845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8283 1726773022.84865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8283 1726773022.84883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8283 1726773022.84973: Evaluated conditional (not __kernel_settings_is_transactional is defined): True 8283 1726773022.84982: variable 'omit' from source: magic vars 8283 1726773022.85026: variable 'omit' from source: magic vars 8283 1726773022.85107: variable '__transactional_update_stat' from source: set_fact 8283 1726773022.85143: variable 'omit' from source: magic vars 8283 1726773022.85165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8283 1726773022.85187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8283 1726773022.85204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8283 1726773022.85218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8283 1726773022.85228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8283 1726773022.85250: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8283 1726773022.85255: variable 'ansible_host' from source: host vars for 'managed_node1' 8283 1726773022.85259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8283 1726773022.85326: Set connection var 
ansible_shell_executable to /bin/sh 8283 1726773022.85332: Set connection var ansible_connection to ssh 8283 1726773022.85338: Set connection var ansible_module_compression to ZIP_DEFLATED 8283 1726773022.85345: Set connection var ansible_timeout to 10 8283 1726773022.85348: Set connection var ansible_shell_type to sh 8283 1726773022.85355: Set connection var ansible_pipelining to False 8283 1726773022.85373: variable 'ansible_shell_executable' from source: unknown 8283 1726773022.85377: variable 'ansible_connection' from source: unknown 8283 1726773022.85380: variable 'ansible_module_compression' from source: unknown 8283 1726773022.85384: variable 'ansible_shell_type' from source: unknown 8283 1726773022.85389: variable 'ansible_shell_executable' from source: unknown 8283 1726773022.85392: variable 'ansible_host' from source: host vars for 'managed_node1' 8283 1726773022.85397: variable 'ansible_pipelining' from source: unknown 8283 1726773022.85400: variable 'ansible_timeout' from source: unknown 8283 1726773022.85405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8283 1726773022.85466: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8283 1726773022.85478: variable 'omit' from source: magic vars 8283 1726773022.85483: starting attempt loop 8283 1726773022.85488: running the handler 8283 1726773022.85497: handler run complete 8283 1726773022.85505: attempt loop complete, returning result 8283 1726773022.85508: _execute() done 8283 1726773022.85511: dumping result to json 8283 1726773022.85514: done dumping result, returning 8283 1726773022.85520: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag if transactional-update exists [0affffe7-6841-f581-0619-0000000001bf] 8283 1726773022.85527: sending task result for task 0affffe7-6841-f581-0619-0000000001bf 8283 1726773022.85547: done sending task result for task 0affffe7-6841-f581-0619-0000000001bf 8283 1726773022.85550: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_is_transactional": false }, "changed": false } 8208 1726773022.85688: no more pending results, returning what we have 8208 1726773022.85691: results queue empty 8208 1726773022.85692: checking for any_errors_fatal 8208 1726773022.85697: done checking for any_errors_fatal 8208 1726773022.85697: checking for max_fail_percentage 8208 1726773022.85699: done checking for max_fail_percentage 8208 1726773022.85699: checking to see if all hosts have failed and the running result is not ok 8208 1726773022.85700: done checking to see if all hosts have failed 8208 1726773022.85700: getting the remaining hosts for this loop 8208 1726773022.85702: done getting the remaining hosts for this loop 8208 1726773022.85705: getting the next task for host managed_node1 8208 1726773022.85712: done getting next task for host managed_node1 8208 1726773022.85715: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8208 1726773022.85718: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773022.85732: getting variables 8208 1726773022.85733: in VariableManager get_vars() 8208 1726773022.85763: Calling all_inventory to load vars for managed_node1 8208 1726773022.85765: Calling groups_inventory to load vars for managed_node1 8208 1726773022.85766: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773022.85772: Calling all_plugins_play to load vars for managed_node1 8208 1726773022.85773: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773022.85775: Calling groups_plugins_play to load vars for managed_node1 8208 1726773022.85911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773022.86028: done with get_vars() 8208 1726773022.86035: done getting variables 8208 1726773022.86113: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set platform/version specific variables] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:31 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.025) 0:00:03.535 **** 8208 1726773022.86137: entering _queue_task() for managed_node1/include_vars 8208 1726773022.86138: Creating lock for include_vars 8208 1726773022.86303: worker is 1 (out of 1 available) 8208 1726773022.86318: exiting _queue_task() for managed_node1/include_vars 8208 1726773022.86329: done queuing things up, now waiting for results queue to drain 8208 1726773022.86331: waiting for pending results... 
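Because both detection tasks are guarded by "not __kernel_settings_is_transactional is defined", a caller that already knows the answer can pre-seed the flag and the probes are skipped. A purely hypothetical illustration of that usage (this run did not do this; only the variable name and the guard are taken from the log):

    # hypothetical host_vars/managed_node1.yml entry: skips the detection tasks above
    __kernel_settings_is_transactional: false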
8284 1726773022.86435: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables 8284 1726773022.86554: in run() - task 0affffe7-6841-f581-0619-0000000001c1 8284 1726773022.86571: variable 'ansible_search_path' from source: unknown 8284 1726773022.86576: variable 'ansible_search_path' from source: unknown 8284 1726773022.86602: calling self._execute() 8284 1726773022.86654: variable 'ansible_host' from source: host vars for 'managed_node1' 8284 1726773022.86663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8284 1726773022.86669: variable 'omit' from source: magic vars 8284 1726773022.86739: variable 'omit' from source: magic vars 8284 1726773022.86783: variable 'omit' from source: magic vars 8284 1726773022.87038: variable 'ffparams' from source: task vars 8284 1726773022.87135: variable 'ansible_facts' from source: unknown 8284 1726773022.87260: variable 'ansible_facts' from source: unknown 8284 1726773022.87347: variable 'ansible_facts' from source: unknown 8284 1726773022.87433: variable 'ansible_facts' from source: unknown 8284 1726773022.87508: variable 'role_path' from source: magic vars 8284 1726773022.87629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 8284 1726773022.87789: Loaded config def from plugin (lookup/first_found) 8284 1726773022.87797: Loading LookupModule 'first_found' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/first_found.py 8284 1726773022.87824: variable 'ansible_search_path' from source: unknown 8284 1726773022.87841: variable 'ansible_search_path' from source: unknown 8284 1726773022.87846: variable 'ansible_search_path' from source: unknown 8284 1726773022.87851: variable 'ansible_search_path' from source: unknown 8284 1726773022.87856: variable 'ansible_search_path' from source: unknown 8284 1726773022.87869: variable 'omit' from source: magic vars 8284 1726773022.87897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8284 1726773022.87917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8284 1726773022.87933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8284 1726773022.87947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8284 1726773022.87956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8284 1726773022.87978: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8284 1726773022.87984: variable 'ansible_host' from source: host vars for 'managed_node1' 8284 1726773022.87990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8284 1726773022.88051: Set connection var ansible_shell_executable to /bin/sh 8284 1726773022.88056: Set connection var ansible_connection to ssh 8284 1726773022.88063: Set connection var ansible_module_compression to ZIP_DEFLATED 8284 1726773022.88070: Set connection var ansible_timeout to 10 8284 1726773022.88073: Set connection var ansible_shell_type to sh 8284 1726773022.88080: Set connection var ansible_pipelining to False 8284 1726773022.88098: variable 'ansible_shell_executable' from source: unknown 8284 1726773022.88102: variable 'ansible_connection' from source: unknown 8284 1726773022.88106: variable 
'ansible_module_compression' from source: unknown 8284 1726773022.88109: variable 'ansible_shell_type' from source: unknown 8284 1726773022.88112: variable 'ansible_shell_executable' from source: unknown 8284 1726773022.88116: variable 'ansible_host' from source: host vars for 'managed_node1' 8284 1726773022.88120: variable 'ansible_pipelining' from source: unknown 8284 1726773022.88123: variable 'ansible_timeout' from source: unknown 8284 1726773022.88127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8284 1726773022.88195: Loading ActionModule 'include_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/include_vars.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8284 1726773022.88207: variable 'omit' from source: magic vars 8284 1726773022.88212: starting attempt loop 8284 1726773022.88215: running the handler 8284 1726773022.88258: handler run complete 8284 1726773022.88268: attempt loop complete, returning result 8284 1726773022.88271: _execute() done 8284 1726773022.88275: dumping result to json 8284 1726773022.88279: done dumping result, returning 8284 1726773022.88287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set platform/version specific variables [0affffe7-6841-f581-0619-0000000001c1] 8284 1726773022.88294: sending task result for task 0affffe7-6841-f581-0619-0000000001c1 8284 1726773022.88316: done sending task result for task 0affffe7-6841-f581-0619-0000000001c1 8284 1726773022.88318: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_packages": [ "tuned", "python3-configobj" ], "__kernel_settings_services": [ "tuned" ] }, "ansible_included_var_files": [ "/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/vars/default.yml" ], "changed": false } 8208 1726773022.88559: no more pending results, returning what we have 8208 1726773022.88562: results queue empty 8208 1726773022.88562: checking for any_errors_fatal 8208 1726773022.88565: done checking for any_errors_fatal 8208 1726773022.88565: checking for max_fail_percentage 8208 1726773022.88566: done checking for max_fail_percentage 8208 1726773022.88566: checking to see if all hosts have failed and the running result is not ok 8208 1726773022.88567: done checking to see if all hosts have failed 8208 1726773022.88567: getting the remaining hosts for this loop 8208 1726773022.88568: done getting the remaining hosts for this loop 8208 1726773022.88570: getting the next task for host managed_node1 8208 1726773022.88576: done getting next task for host managed_node1 8208 1726773022.88579: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8208 1726773022.88581: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773022.88589: getting variables 8208 1726773022.88590: in VariableManager get_vars() 8208 1726773022.88613: Calling all_inventory to load vars for managed_node1 8208 1726773022.88615: Calling groups_inventory to load vars for managed_node1 8208 1726773022.88616: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773022.88625: Calling all_plugins_play to load vars for managed_node1 8208 1726773022.88627: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773022.88628: Calling groups_plugins_play to load vars for managed_node1 8208 1726773022.88728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773022.88843: done with get_vars() 8208 1726773022.88850: done getting variables 8208 1726773022.88920: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required packages are installed] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12 Thursday 19 September 2024 15:10:22 -0400 (0:00:00.028) 0:00:03.563 **** 8208 1726773022.88942: entering _queue_task() for managed_node1/package 8208 1726773022.88943: Creating lock for package 8208 1726773022.89117: worker is 1 (out of 1 available) 8208 1726773022.89132: exiting _queue_task() for managed_node1/package 8208 1726773022.89143: done queuing things up, now waiting for results queue to drain 8208 1726773022.89144: waiting for pending results... 
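The "Set platform/version specific variables" result above comes from an include_vars driven by a first_found lookup: the log shows the ffparams task variable, several ansible_facts references, role_path, and the file that finally matched, vars/default.yml. A sketch of the two pieces; the candidate file list is an assumption, while the default.yml contents mirror the facts reported in the task result:

    # sketch of tasks/set_vars.yml (candidate list assumed; only first_found, ffparams and role_path are confirmed by the log)
    - name: Set platform/version specific variables
      include_vars: "{{ lookup('first_found', ffparams) }}"
      vars:
        ffparams:
          files:
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - "{{ ansible_facts['os_family'] }}.yml"
            - default.yml
          paths:
            - "{{ role_path }}/vars"

    # vars/default.yml as implied by the included-vars result above
    __kernel_settings_packages:
      - tuned
      - python3-configobj
    __kernel_settings_services:
      - tuned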
8285 1726773022.89247: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed 8285 1726773022.89353: in run() - task 0affffe7-6841-f581-0619-00000000013f 8285 1726773022.89371: variable 'ansible_search_path' from source: unknown 8285 1726773022.89375: variable 'ansible_search_path' from source: unknown 8285 1726773022.89403: calling self._execute() 8285 1726773022.89457: variable 'ansible_host' from source: host vars for 'managed_node1' 8285 1726773022.89465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8285 1726773022.89474: variable 'omit' from source: magic vars 8285 1726773022.89546: variable 'omit' from source: magic vars 8285 1726773022.89578: variable 'omit' from source: magic vars 8285 1726773022.89598: variable '__kernel_settings_packages' from source: include_vars 8285 1726773022.89860: variable '__kernel_settings_packages' from source: include_vars 8285 1726773022.90010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8285 1726773022.91626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8285 1726773022.91674: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8285 1726773022.91715: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8285 1726773022.91742: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8285 1726773022.91765: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8285 1726773022.91833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8285 1726773022.91854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8285 1726773022.91874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8285 1726773022.91904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8285 1726773022.91917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8285 1726773022.91990: variable '__kernel_settings_is_ostree' from source: set_fact 8285 1726773022.91997: variable 'omit' from source: magic vars 8285 1726773022.92018: variable 'omit' from source: magic vars 8285 1726773022.92039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8285 1726773022.92057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8285 1726773022.92071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8285 1726773022.92082: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8285 1726773022.92102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8285 1726773022.92124: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8285 1726773022.92129: variable 'ansible_host' from source: host vars for 'managed_node1' 8285 1726773022.92133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8285 1726773022.92199: Set connection var ansible_shell_executable to /bin/sh 8285 1726773022.92203: Set connection var ansible_connection to ssh 8285 1726773022.92207: Set connection var ansible_module_compression to ZIP_DEFLATED 8285 1726773022.92212: Set connection var ansible_timeout to 10 8285 1726773022.92214: Set connection var ansible_shell_type to sh 8285 1726773022.92218: Set connection var ansible_pipelining to False 8285 1726773022.92233: variable 'ansible_shell_executable' from source: unknown 8285 1726773022.92236: variable 'ansible_connection' from source: unknown 8285 1726773022.92238: variable 'ansible_module_compression' from source: unknown 8285 1726773022.92239: variable 'ansible_shell_type' from source: unknown 8285 1726773022.92241: variable 'ansible_shell_executable' from source: unknown 8285 1726773022.92243: variable 'ansible_host' from source: host vars for 'managed_node1' 8285 1726773022.92246: variable 'ansible_pipelining' from source: unknown 8285 1726773022.92248: variable 'ansible_timeout' from source: unknown 8285 1726773022.92250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8285 1726773022.92310: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8285 1726773022.92319: variable 'omit' from source: magic vars 8285 1726773022.92323: starting attempt loop 8285 1726773022.92326: running the handler 8285 1726773022.92386: variable 'ansible_facts' from source: unknown 8285 1726773022.92464: _low_level_execute_command(): starting 8285 1726773022.92471: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8285 1726773022.94918: stdout chunk (state=2): >>>/root <<< 8285 1726773022.95050: stderr chunk (state=3): >>><<< 8285 1726773022.95062: stdout chunk (state=3): >>><<< 8285 1726773022.95086: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8285 1726773022.95102: _low_level_execute_command(): starting 8285 1726773022.95110: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620 `" && echo ansible-tmp-1726773022.9509754-8285-99502191974620="` echo /root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620 `" ) && sleep 0' 8285 1726773022.97918: stdout chunk (state=2): >>>ansible-tmp-1726773022.9509754-8285-99502191974620=/root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620 <<< 8285 1726773022.98150: stderr chunk (state=3): >>><<< 8285 1726773022.98163: stdout chunk (state=3): >>><<< 8285 1726773022.98181: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726773022.9509754-8285-99502191974620=/root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620 , stderr= 8285 1726773022.98214: variable 'ansible_module_compression' from source: unknown 8285 1726773022.98274: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 8285 1726773022.98280: ANSIBALLZ: Acquiring lock 8285 1726773022.98284: ANSIBALLZ: Lock acquired: 139627423671568 8285 1726773022.98291: ANSIBALLZ: Creating module 8285 1726773023.17489: ANSIBALLZ: Writing module into payload 8285 1726773023.17763: ANSIBALLZ: Writing module 8285 1726773023.17792: ANSIBALLZ: Renaming module 8285 1726773023.17800: ANSIBALLZ: Done creating module 8285 1726773023.17819: variable 'ansible_facts' from source: unknown 8285 1726773023.18080: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620/AnsiballZ_dnf.py 8285 1726773023.18532: Sending initial data 8285 1726773023.18539: Sent initial data (149 bytes) 8285 1726773023.21392: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpgd7criji /root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620/AnsiballZ_dnf.py <<< 8285 1726773023.23368: stderr chunk (state=3): >>><<< 8285 1726773023.23380: stdout chunk (state=3): >>><<< 8285 1726773023.23406: done transferring module to remote 8285 1726773023.23419: _low_level_execute_command(): starting 8285 1726773023.23426: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620/ /root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620/AnsiballZ_dnf.py && sleep 0' 8285 1726773023.26155: stderr chunk (state=2): >>><<< 8285 1726773023.26168: stdout chunk (state=2): >>><<< 8285 1726773023.26188: _low_level_execute_command() done: rc=0, stdout=, stderr= 8285 1726773023.26193: _low_level_execute_command(): starting 8285 1726773023.26199: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620/AnsiballZ_dnf.py && sleep 0' 8285 1726773028.46430: stdout chunk (state=2): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} <<< 8285 1726773028.54013: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. 
<<< 8285 1726773028.54064: stderr chunk (state=3): >>><<< 8285 1726773028.54074: stdout chunk (state=3): >>><<< 8285 1726773028.54092: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["tuned", "python3-configobj"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "allowerasing": false, "nobest": false, "use_backend": "auto", "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}} , stderr=Shared connection to 10.31.43.7 closed. 8285 1726773028.54125: done with _execute_module (ansible.legacy.dnf, {'name': ['tuned', 'python3-configobj'], 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8285 1726773028.54133: _low_level_execute_command(): starting 8285 1726773028.54139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773022.9509754-8285-99502191974620/ > /dev/null 2>&1 && sleep 0' 8285 1726773028.56673: stderr chunk (state=2): >>><<< 8285 1726773028.56684: stdout chunk (state=2): >>><<< 8285 1726773028.56702: _low_level_execute_command() done: rc=0, stdout=, stderr= 8285 1726773028.56710: handler run complete 8285 1726773028.56739: attempt loop complete, returning result 8285 1726773028.56744: _execute() done 8285 1726773028.56747: dumping result to json 8285 1726773028.56754: done dumping result, returning 8285 1726773028.56764: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure required packages are installed [0affffe7-6841-f581-0619-00000000013f] 8285 1726773028.56771: sending task result for task 0affffe7-6841-f581-0619-00000000013f 8285 1726773028.56801: done sending task result for task 0affffe7-6841-f581-0619-00000000013f 8285 1726773028.56805: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 8208 1726773028.56968: no more pending results, returning what we have 8208 1726773028.56971: results queue empty 8208 1726773028.56972: checking for any_errors_fatal 8208 1726773028.56977: done checking for any_errors_fatal 8208 1726773028.56978: checking for max_fail_percentage 8208 1726773028.56979: done checking for max_fail_percentage 8208 1726773028.56979: checking to see if all hosts have failed and the running result is not ok 8208 1726773028.56980: done checking to see if all hosts have failed 8208 1726773028.56981: getting the remaining hosts for this loop 8208 1726773028.56982: done getting 
the remaining hosts for this loop 8208 1726773028.56986: getting the next task for host managed_node1 8208 1726773028.56994: done getting next task for host managed_node1 8208 1726773028.56998: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8208 1726773028.57001: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773028.57010: getting variables 8208 1726773028.57011: in VariableManager get_vars() 8208 1726773028.57043: Calling all_inventory to load vars for managed_node1 8208 1726773028.57046: Calling groups_inventory to load vars for managed_node1 8208 1726773028.57049: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773028.57057: Calling all_plugins_play to load vars for managed_node1 8208 1726773028.57059: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773028.57062: Calling groups_plugins_play to load vars for managed_node1 8208 1726773028.57225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773028.57345: done with get_vars() 8208 1726773028.57354: done getting variables 8208 1726773028.57429: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:24 Thursday 19 September 2024 15:10:28 -0400 (0:00:05.685) 0:00:09.248 **** 8208 1726773028.57454: entering _queue_task() for managed_node1/debug 8208 1726773028.57455: Creating lock for debug 8208 1726773028.57650: worker is 1 (out of 1 available) 8208 1726773028.57667: exiting _queue_task() for managed_node1/debug 8208 1726773028.57676: done queuing things up, now waiting for results queue to drain 8208 1726773028.57678: waiting for pending results... 
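The "Ensure required packages are installed" task above (tasks/main.yml:12) resolved the generic package action to ansible.legacy.dnf, passed it the __kernel_settings_packages list, and returned rc=0 with "Nothing to do", i.e. tuned and python3-configobj were already installed. A hedged sketch of the task; the __kernel_settings_is_ostree lookup visible in the log suggests an rpm-ostree-specific backend choice, noted here only as an assumption:

    - name: Ensure required packages are installed
      package:
        name: "{{ __kernel_settings_packages }}"
        state: present
      # the log reads __kernel_settings_is_ostree before building the dnf call;
      # presumably it selects a different packaging backend on ostree systems (assumption)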
8403 1726773028.57790: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes 8403 1726773028.57909: in run() - task 0affffe7-6841-f581-0619-000000000141 8403 1726773028.57926: variable 'ansible_search_path' from source: unknown 8403 1726773028.57930: variable 'ansible_search_path' from source: unknown 8403 1726773028.57959: calling self._execute() 8403 1726773028.58022: variable 'ansible_host' from source: host vars for 'managed_node1' 8403 1726773028.58030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8403 1726773028.58038: variable 'omit' from source: magic vars 8403 1726773028.58380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8403 1726773028.60148: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8403 1726773028.60205: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8403 1726773028.60234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8403 1726773028.60261: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8403 1726773028.60289: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8403 1726773028.60346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8403 1726773028.60370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8403 1726773028.60392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8403 1726773028.60421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8403 1726773028.60433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8403 1726773028.60513: variable '__kernel_settings_is_transactional' from source: set_fact 8403 1726773028.60531: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8403 1726773028.60536: when evaluation is False, skipping this task 8403 1726773028.60540: _execute() done 8403 1726773028.60543: dumping result to json 8403 1726773028.60547: done dumping result, returning 8403 1726773028.60553: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Notify user that reboot is needed to apply changes [0affffe7-6841-f581-0619-000000000141] 8403 1726773028.60560: sending task result for task 0affffe7-6841-f581-0619-000000000141 8403 1726773028.60584: done sending task result for task 0affffe7-6841-f581-0619-000000000141 8403 1726773028.60589: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "__kernel_settings_is_transactional | d(false)" } 8208 
1726773028.60699: no more pending results, returning what we have 8208 1726773028.60702: results queue empty 8208 1726773028.60702: checking for any_errors_fatal 8208 1726773028.60709: done checking for any_errors_fatal 8208 1726773028.60710: checking for max_fail_percentage 8208 1726773028.60711: done checking for max_fail_percentage 8208 1726773028.60714: checking to see if all hosts have failed and the running result is not ok 8208 1726773028.60715: done checking to see if all hosts have failed 8208 1726773028.60716: getting the remaining hosts for this loop 8208 1726773028.60717: done getting the remaining hosts for this loop 8208 1726773028.60720: getting the next task for host managed_node1 8208 1726773028.60726: done getting next task for host managed_node1 8208 1726773028.60730: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8208 1726773028.60733: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773028.60746: getting variables 8208 1726773028.60747: in VariableManager get_vars() 8208 1726773028.60782: Calling all_inventory to load vars for managed_node1 8208 1726773028.60784: Calling groups_inventory to load vars for managed_node1 8208 1726773028.60788: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773028.60796: Calling all_plugins_play to load vars for managed_node1 8208 1726773028.60798: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773028.60801: Calling groups_plugins_play to load vars for managed_node1 8208 1726773028.60921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773028.61040: done with get_vars() 8208 1726773028.61047: done getting variables 8208 1726773028.61144: Loading ActionModule 'reboot' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/reboot.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Reboot transactional update systems] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:29 Thursday 19 September 2024 15:10:28 -0400 (0:00:00.037) 0:00:09.285 **** 8208 1726773028.61168: entering _queue_task() for managed_node1/reboot 8208 1726773028.61169: Creating lock for reboot 8208 1726773028.61352: worker is 1 (out of 1 available) 8208 1726773028.61366: exiting _queue_task() for managed_node1/reboot 8208 1726773028.61377: done queuing things up, now waiting for results queue to drain 8208 1726773028.61380: waiting for 
pending results... 8404 1726773028.61497: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems 8404 1726773028.61610: in run() - task 0affffe7-6841-f581-0619-000000000142 8404 1726773028.61627: variable 'ansible_search_path' from source: unknown 8404 1726773028.61631: variable 'ansible_search_path' from source: unknown 8404 1726773028.61661: calling self._execute() 8404 1726773028.61788: variable 'ansible_host' from source: host vars for 'managed_node1' 8404 1726773028.61796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8404 1726773028.61804: variable 'omit' from source: magic vars 8404 1726773028.62135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8404 1726773028.63835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8404 1726773028.63899: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8404 1726773028.63930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8404 1726773028.63957: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8404 1726773028.63982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8404 1726773028.64037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8404 1726773028.64058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8404 1726773028.64077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8404 1726773028.64116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8404 1726773028.64128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8404 1726773028.64207: variable '__kernel_settings_is_transactional' from source: set_fact 8404 1726773028.64225: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8404 1726773028.64231: when evaluation is False, skipping this task 8404 1726773028.64235: _execute() done 8404 1726773028.64238: dumping result to json 8404 1726773028.64242: done dumping result, returning 8404 1726773028.64248: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Reboot transactional update systems [0affffe7-6841-f581-0619-000000000142] 8404 1726773028.64253: sending task result for task 0affffe7-6841-f581-0619-000000000142 8404 1726773028.64278: done sending task result for task 0affffe7-6841-f581-0619-000000000142 8404 1726773028.64281: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", 
"skip_reason": "Conditional result was False" } 8208 1726773028.64462: no more pending results, returning what we have 8208 1726773028.64465: results queue empty 8208 1726773028.64465: checking for any_errors_fatal 8208 1726773028.64469: done checking for any_errors_fatal 8208 1726773028.64469: checking for max_fail_percentage 8208 1726773028.64471: done checking for max_fail_percentage 8208 1726773028.64471: checking to see if all hosts have failed and the running result is not ok 8208 1726773028.64472: done checking to see if all hosts have failed 8208 1726773028.64472: getting the remaining hosts for this loop 8208 1726773028.64473: done getting the remaining hosts for this loop 8208 1726773028.64476: getting the next task for host managed_node1 8208 1726773028.64482: done getting next task for host managed_node1 8208 1726773028.64486: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8208 1726773028.64490: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773028.64502: getting variables 8208 1726773028.64504: in VariableManager get_vars() 8208 1726773028.64529: Calling all_inventory to load vars for managed_node1 8208 1726773028.64531: Calling groups_inventory to load vars for managed_node1 8208 1726773028.64533: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773028.64540: Calling all_plugins_play to load vars for managed_node1 8208 1726773028.64542: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773028.64544: Calling groups_plugins_play to load vars for managed_node1 8208 1726773028.64639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773028.64758: done with get_vars() 8208 1726773028.64766: done getting variables 8208 1726773028.64806: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:34 Thursday 19 September 2024 15:10:28 -0400 (0:00:00.036) 0:00:09.322 **** 8208 1726773028.64828: entering _queue_task() for managed_node1/fail 8208 1726773028.64990: worker is 1 (out of 1 available) 8208 1726773028.65005: exiting _queue_task() for managed_node1/fail 8208 1726773028.65015: done queuing things up, now waiting for results queue to drain 8208 1726773028.65017: waiting for pending results... 8405 1726773028.65131: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set 8405 1726773028.65243: in run() - task 0affffe7-6841-f581-0619-000000000143 8405 1726773028.65260: variable 'ansible_search_path' from source: unknown 8405 1726773028.65266: variable 'ansible_search_path' from source: unknown 8405 1726773028.65295: calling self._execute() 8405 1726773028.65353: variable 'ansible_host' from source: host vars for 'managed_node1' 8405 1726773028.65361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8405 1726773028.65372: variable 'omit' from source: magic vars 8405 1726773028.65706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8405 1726773028.67406: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8405 1726773028.67454: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8405 1726773028.67488: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8405 1726773028.67516: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8405 1726773028.67548: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8405 1726773028.67609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8405 1726773028.67631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8405 1726773028.67649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8405 1726773028.67678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8405 1726773028.67692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8405 1726773028.67772: variable '__kernel_settings_is_transactional' from source: set_fact 8405 1726773028.67791: Evaluated conditional (__kernel_settings_is_transactional | d(false)): False 8405 1726773028.67796: when evaluation is False, skipping this task 8405 1726773028.67799: _execute() done 8405 1726773028.67803: dumping result to json 8405 1726773028.67806: done dumping result, returning 8405 1726773028.67812: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Fail if reboot is needed and not set [0affffe7-6841-f581-0619-000000000143] 8405 1726773028.67817: sending task result for task 0affffe7-6841-f581-0619-000000000143 8405 1726773028.67837: done sending task result for task 0affffe7-6841-f581-0619-000000000143 8405 1726773028.67839: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_is_transactional | d(false)", "skip_reason": "Conditional result was False" } 8208 1726773028.67990: no more pending results, returning what we have 8208 1726773028.67993: results queue empty 8208 1726773028.67994: checking for any_errors_fatal 8208 1726773028.67998: done checking for any_errors_fatal 8208 1726773028.67999: checking for max_fail_percentage 8208 1726773028.68000: done checking for max_fail_percentage 8208 1726773028.68001: checking to see if all hosts have failed and the running result is not ok 8208 1726773028.68002: done checking to see if all hosts have failed 8208 1726773028.68002: getting the remaining hosts for this loop 8208 1726773028.68004: done getting the remaining hosts for this loop 8208 1726773028.68007: getting the next task for host managed_node1 8208 1726773028.68015: done getting next task for host managed_node1 8208 1726773028.68018: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8208 1726773028.68023: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773028.68036: getting variables 8208 1726773028.68038: in VariableManager get_vars() 8208 1726773028.68070: Calling all_inventory to load vars for managed_node1 8208 1726773028.68073: Calling groups_inventory to load vars for managed_node1 8208 1726773028.68075: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773028.68082: Calling all_plugins_play to load vars for managed_node1 8208 1726773028.68086: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773028.68089: Calling groups_plugins_play to load vars for managed_node1 8208 1726773028.68208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773028.68499: done with get_vars() 8208 1726773028.68506: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Read tuned main config] ****** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42 Thursday 19 September 2024 15:10:28 -0400 (0:00:00.037) 0:00:09.359 **** 8208 1726773028.68561: entering _queue_task() for managed_node1/fedora.linux_system_roles.kernel_settings_get_config 8208 1726773028.68562: Creating lock for fedora.linux_system_roles.kernel_settings_get_config 8208 1726773028.68744: worker is 1 (out of 1 available) 8208 1726773028.68759: exiting _queue_task() for managed_node1/fedora.linux_system_roles.kernel_settings_get_config 8208 1726773028.68768: done queuing things up, now waiting for results queue to drain 8208 1726773028.68770: waiting for pending results... 8406 1726773028.68888: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config 8406 1726773028.68997: in run() - task 0affffe7-6841-f581-0619-000000000145 8406 1726773028.69016: variable 'ansible_search_path' from source: unknown 8406 1726773028.69020: variable 'ansible_search_path' from source: unknown 8406 1726773028.69047: calling self._execute() 8406 1726773028.69113: variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726773028.69122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726773028.69130: variable 'omit' from source: magic vars 8406 1726773028.69213: variable 'omit' from source: magic vars 8406 1726773028.69248: variable 'omit' from source: magic vars 8406 1726773028.69268: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 8406 1726773028.69493: variable '__kernel_settings_tuned_main_conf_file' from source: role '' all vars 8406 1726773028.69555: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8406 1726773028.69583: variable 'omit' from source: magic vars 8406 1726773028.69618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8406 1726773028.69647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8406 1726773028.69668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8406 1726773028.69682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8406 1726773028.69695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8406 1726773028.69718: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8406 1726773028.69723: 
variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726773028.69728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726773028.69801: Set connection var ansible_shell_executable to /bin/sh 8406 1726773028.69806: Set connection var ansible_connection to ssh 8406 1726773028.69812: Set connection var ansible_module_compression to ZIP_DEFLATED 8406 1726773028.69820: Set connection var ansible_timeout to 10 8406 1726773028.69823: Set connection var ansible_shell_type to sh 8406 1726773028.69830: Set connection var ansible_pipelining to False 8406 1726773028.69848: variable 'ansible_shell_executable' from source: unknown 8406 1726773028.69852: variable 'ansible_connection' from source: unknown 8406 1726773028.69855: variable 'ansible_module_compression' from source: unknown 8406 1726773028.69857: variable 'ansible_shell_type' from source: unknown 8406 1726773028.69859: variable 'ansible_shell_executable' from source: unknown 8406 1726773028.69860: variable 'ansible_host' from source: host vars for 'managed_node1' 8406 1726773028.69863: variable 'ansible_pipelining' from source: unknown 8406 1726773028.69867: variable 'ansible_timeout' from source: unknown 8406 1726773028.69870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8406 1726773028.70014: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8406 1726773028.70025: variable 'omit' from source: magic vars 8406 1726773028.70031: starting attempt loop 8406 1726773028.70035: running the handler 8406 1726773028.70046: _low_level_execute_command(): starting 8406 1726773028.70053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8406 1726773028.72507: stdout chunk (state=2): >>>/root <<< 8406 1726773028.72629: stderr chunk (state=3): >>><<< 8406 1726773028.72638: stdout chunk (state=3): >>><<< 8406 1726773028.72660: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8406 1726773028.72675: _low_level_execute_command(): starting 8406 1726773028.72680: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580 `" && echo ansible-tmp-1726773028.7267008-8406-92298958984580="` echo /root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580 `" ) && sleep 0' 8406 1726773028.75213: stdout chunk (state=2): >>>ansible-tmp-1726773028.7267008-8406-92298958984580=/root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580 <<< 8406 1726773028.75343: stderr chunk (state=3): >>><<< 8406 1726773028.75352: stdout chunk (state=3): >>><<< 8406 1726773028.75372: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773028.7267008-8406-92298958984580=/root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580 , stderr= 8406 1726773028.75415: variable 'ansible_module_compression' from source: unknown 8406 1726773028.75454: ANSIBALLZ: Using lock for fedora.linux_system_roles.kernel_settings_get_config 8406 1726773028.75460: ANSIBALLZ: Acquiring lock 8406 1726773028.75463: ANSIBALLZ: Lock acquired: 139627421004624 8406 1726773028.75470: ANSIBALLZ: Creating module 8406 1726773028.84782: ANSIBALLZ: Writing module into payload 8406 1726773028.84841: ANSIBALLZ: Writing module 
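The two _low_level_execute_command() calls above are the standard bootstrap Ansible performs before running any module on a target: "echo ~" resolves the remote user's home directory, and the "umask 77 && mkdir -p ... && mkdir ansible-tmp-<timestamp>-<pid>-<random>" command creates a private per-task working directory under ~/.ansible/tmp; the AnsiballZ payload being written next is what gets copied into it. The following is only a minimal, standalone Python sketch of that same shell sequence run locally for illustration; the helper name make_task_tmpdir is my own, and the real logic lives in Ansible's connection and shell plugins.

    import os
    import random
    import subprocess
    import time

    def make_task_tmpdir() -> str:
        # Resolve the home directory the same way the log does: /bin/sh -c 'echo ~ && sleep 0'
        home = subprocess.run(
            ["/bin/sh", "-c", "echo ~ && sleep 0"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        # Name mirrors the ansible-tmp-<timestamp>-<pid>-<random> pattern seen above.
        name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
        base = home + "/.ansible/tmp"
        cmd = (
            '( umask 77 && mkdir -p "%s" && mkdir "%s/%s" && echo "%s/%s" ) && sleep 0'
            % (base, base, name, base, name)
        )
        # The restrictive umask keeps the per-task directory private to that user.
        return subprocess.run(
            ["/bin/sh", "-c", cmd], capture_output=True, text=True, check=True,
        ).stdout.strip()

    print(make_task_tmpdir())

As the next chunks show, the packaged AnsiballZ_kernel_settings_get_config.py is then transferred into that directory over SFTP, made executable with chmod u+x, run with /usr/libexec/platform-python, and the directory is removed again afterwards.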
8406 1726773028.84863: ANSIBALLZ: Renaming module 8406 1726773028.84872: ANSIBALLZ: Done creating module 8406 1726773028.84892: variable 'ansible_facts' from source: unknown 8406 1726773028.84945: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580/AnsiballZ_kernel_settings_get_config.py 8406 1726773028.85046: Sending initial data 8406 1726773028.85053: Sent initial data (172 bytes) 8406 1726773028.87831: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpsloj3d0_ /root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580/AnsiballZ_kernel_settings_get_config.py <<< 8406 1726773028.89238: stderr chunk (state=3): >>><<< 8406 1726773028.89247: stdout chunk (state=3): >>><<< 8406 1726773028.89270: done transferring module to remote 8406 1726773028.89282: _low_level_execute_command(): starting 8406 1726773028.89289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580/ /root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8406 1726773028.91774: stderr chunk (state=2): >>><<< 8406 1726773028.91786: stdout chunk (state=2): >>><<< 8406 1726773028.91802: _low_level_execute_command() done: rc=0, stdout=, stderr= 8406 1726773028.91806: _low_level_execute_command(): starting 8406 1726773028.91811: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8406 1726773029.07303: stdout chunk (state=2): >>> {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} <<< 8406 1726773029.08328: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8406 1726773029.08372: stderr chunk (state=3): >>><<< 8406 1726773029.08379: stdout chunk (state=3): >>><<< 8406 1726773029.08399: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", "update_interval": "10", "recommend_command": "1", "reapply_sysctl": "1", "default_instance_priority": "0", "udev_buffer_size": "1MB", "log_file_count": "2", "log_file_max_size": "1MB"}, "invocation": {"module_args": {"path": "/etc/tuned/tuned-main.conf"}}} , stderr=Shared connection to 10.31.43.7 closed. 
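The module result above is the parsed content of /etc/tuned/tuned-main.conf returned as flat string key/value pairs (daemon, dynamic_tuning, sleep_interval, and so on). Below is a rough sketch of reading such a flat "key = value" file; it only mirrors the shape of the data shown in the log and is not the implementation of fedora.linux_system_roles.kernel_settings_get_config.

    def read_flat_config(path):
        # Collect "key = value" lines, skipping blanks and comments; every value
        # stays a string, matching the "data" dict in the task result above.
        data = {}
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                data[key.strip()] = value.strip()
        return data

    # read_flat_config("/etc/tuned/tuned-main.conf")
    # -> {"daemon": "1", "dynamic_tuning": "0", "sleep_interval": "1", ...}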
8406 1726773029.08422: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/tuned-main.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8406 1726773029.08432: _low_level_execute_command(): starting 8406 1726773029.08438: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773028.7267008-8406-92298958984580/ > /dev/null 2>&1 && sleep 0' 8406 1726773029.10925: stderr chunk (state=2): >>><<< 8406 1726773029.10933: stdout chunk (state=2): >>><<< 8406 1726773029.10947: _low_level_execute_command() done: rc=0, stdout=, stderr= 8406 1726773029.10955: handler run complete 8406 1726773029.10972: attempt loop complete, returning result 8406 1726773029.10977: _execute() done 8406 1726773029.10980: dumping result to json 8406 1726773029.10984: done dumping result, returning 8406 1726773029.10994: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Read tuned main config [0affffe7-6841-f581-0619-000000000145] 8406 1726773029.10999: sending task result for task 0affffe7-6841-f581-0619-000000000145 8406 1726773029.11029: done sending task result for task 0affffe7-6841-f581-0619-000000000145 8406 1726773029.11032: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "data": { "daemon": "1", "default_instance_priority": "0", "dynamic_tuning": "0", "log_file_count": "2", "log_file_max_size": "1MB", "reapply_sysctl": "1", "recommend_command": "1", "sleep_interval": "1", "udev_buffer_size": "1MB", "update_interval": "10" } } 8208 1726773029.11308: no more pending results, returning what we have 8208 1726773029.11311: results queue empty 8208 1726773029.11311: checking for any_errors_fatal 8208 1726773029.11315: done checking for any_errors_fatal 8208 1726773029.11316: checking for max_fail_percentage 8208 1726773029.11317: done checking for max_fail_percentage 8208 1726773029.11317: checking to see if all hosts have failed and the running result is not ok 8208 1726773029.11318: done checking to see if all hosts have failed 8208 1726773029.11318: getting the remaining hosts for this loop 8208 1726773029.11319: done getting the remaining hosts for this loop 8208 1726773029.11322: getting the next task for host managed_node1 8208 1726773029.11327: done getting next task for host managed_node1 8208 1726773029.11329: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8208 1726773029.11331: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773029.11338: getting variables 8208 1726773029.11339: in VariableManager get_vars() 8208 1726773029.11364: Calling all_inventory to load vars for managed_node1 8208 1726773029.11366: Calling groups_inventory to load vars for managed_node1 8208 1726773029.11368: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773029.11374: Calling all_plugins_play to load vars for managed_node1 8208 1726773029.11375: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773029.11377: Calling groups_plugins_play to load vars for managed_node1 8208 1726773029.11489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773029.11620: done with get_vars() 8208 1726773029.11628: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50 Thursday 19 September 2024 15:10:29 -0400 (0:00:00.431) 0:00:09.791 **** 8208 1726773029.11695: entering _queue_task() for managed_node1/stat 8208 1726773029.11861: worker is 1 (out of 1 available) 8208 1726773029.11876: exiting _queue_task() for managed_node1/stat 8208 1726773029.11889: done queuing things up, now waiting for results queue to drain 8208 1726773029.11891: waiting for pending results... 
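The stat task queued above drives the directory-selection logic visible in the next chunks: the loop items are an empty string (skipped by the "item | length > 0" guard), /etc/tuned/profiles (which does not exist on this node) and /etc/tuned (which does), and a later set_fact records /etc/tuned as __kernel_settings_profile_parent, apparently keeping the first candidate that exists. A compact sketch of that selection, assuming those same candidates, is:

    from pathlib import Path

    def pick_profile_parent(candidates):
        # Skip empty candidates (the "item | length > 0" guard in the log),
        # then keep the first candidate that exists as a directory.
        for item in candidates:
            if len(item) == 0:
                continue
            if Path(item).is_dir():
                return item
        return None

    # On this managed node /etc/tuned/profiles is absent, so /etc/tuned is chosen,
    # matching the __kernel_settings_profile_parent fact set further down.
    print(pick_profile_parent(["", "/etc/tuned/profiles", "/etc/tuned"]))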
8421 1726773029.12008: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory 8421 1726773029.12123: in run() - task 0affffe7-6841-f581-0619-000000000146 8421 1726773029.12139: variable 'ansible_search_path' from source: unknown 8421 1726773029.12143: variable 'ansible_search_path' from source: unknown 8421 1726773029.12182: variable '__prof_from_conf' from source: task vars 8421 1726773029.12425: variable '__prof_from_conf' from source: task vars 8421 1726773029.12556: variable '__data' from source: task vars 8421 1726773029.12613: variable '__kernel_settings_register_tuned_main' from source: set_fact 8421 1726773029.12750: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8421 1726773029.12761: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8421 1726773029.12809: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8421 1726773029.12823: variable 'omit' from source: magic vars 8421 1726773029.12951: variable 'ansible_host' from source: host vars for 'managed_node1' 8421 1726773029.12962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8421 1726773029.12973: variable 'omit' from source: magic vars 8421 1726773029.13148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8421 1726773029.14663: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8421 1726773029.14720: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8421 1726773029.14751: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8421 1726773029.14777: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8421 1726773029.14799: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8421 1726773029.14855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8421 1726773029.14877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8421 1726773029.14897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8421 1726773029.14925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8421 1726773029.14938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8421 1726773029.15006: variable 'item' from source: unknown 8421 1726773029.15023: Evaluated conditional (item | length > 0): False 8421 1726773029.15028: when evaluation is False, skipping this task 8421 1726773029.15054: variable 'item' from source: unknown 8421 1726773029.15104: variable 'item' from source: unknown skipping: [managed_node1] => (item=) => { 
"ansible_loop_var": "item", "changed": false, "false_condition": "item | length > 0", "item": "", "skip_reason": "Conditional result was False" } 8421 1726773029.15178: variable 'ansible_host' from source: host vars for 'managed_node1' 8421 1726773029.15189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8421 1726773029.15199: variable 'omit' from source: magic vars 8421 1726773029.15315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8421 1726773029.15333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8421 1726773029.15350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8421 1726773029.15380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8421 1726773029.15393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8421 1726773029.15443: variable 'item' from source: unknown 8421 1726773029.15452: Evaluated conditional (item | length > 0): True 8421 1726773029.15458: variable 'omit' from source: magic vars 8421 1726773029.15495: variable 'omit' from source: magic vars 8421 1726773029.15525: variable 'item' from source: unknown 8421 1726773029.15569: variable 'item' from source: unknown 8421 1726773029.15583: variable 'omit' from source: magic vars 8421 1726773029.15608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8421 1726773029.15628: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8421 1726773029.15643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8421 1726773029.15656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8421 1726773029.15665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8421 1726773029.15690: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8421 1726773029.15695: variable 'ansible_host' from source: host vars for 'managed_node1' 8421 1726773029.15699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8421 1726773029.15759: Set connection var ansible_shell_executable to /bin/sh 8421 1726773029.15764: Set connection var ansible_connection to ssh 8421 1726773029.15769: Set connection var ansible_module_compression to ZIP_DEFLATED 8421 1726773029.15773: Set connection var ansible_timeout to 10 8421 1726773029.15775: Set connection var ansible_shell_type to sh 8421 1726773029.15779: Set connection var ansible_pipelining to False 8421 1726773029.15794: variable 'ansible_shell_executable' from source: unknown 8421 1726773029.15797: variable 'ansible_connection' from source: unknown 8421 1726773029.15800: variable 
'ansible_module_compression' from source: unknown 8421 1726773029.15802: variable 'ansible_shell_type' from source: unknown 8421 1726773029.15803: variable 'ansible_shell_executable' from source: unknown 8421 1726773029.15805: variable 'ansible_host' from source: host vars for 'managed_node1' 8421 1726773029.15807: variable 'ansible_pipelining' from source: unknown 8421 1726773029.15809: variable 'ansible_timeout' from source: unknown 8421 1726773029.15812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8421 1726773029.15897: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8421 1726773029.15905: variable 'omit' from source: magic vars 8421 1726773029.15910: starting attempt loop 8421 1726773029.15912: running the handler 8421 1726773029.15922: _low_level_execute_command(): starting 8421 1726773029.15927: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8421 1726773029.18332: stdout chunk (state=2): >>>/root <<< 8421 1726773029.18445: stderr chunk (state=3): >>><<< 8421 1726773029.18452: stdout chunk (state=3): >>><<< 8421 1726773029.18471: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8421 1726773029.18485: _low_level_execute_command(): starting 8421 1726773029.18494: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576 `" && echo ansible-tmp-1726773029.1848083-8421-24713679460576="` echo /root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576 `" ) && sleep 0' 8421 1726773029.21000: stdout chunk (state=2): >>>ansible-tmp-1726773029.1848083-8421-24713679460576=/root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576 <<< 8421 1726773029.21129: stderr chunk (state=3): >>><<< 8421 1726773029.21138: stdout chunk (state=3): >>><<< 8421 1726773029.21154: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773029.1848083-8421-24713679460576=/root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576 , stderr= 8421 1726773029.21197: variable 'ansible_module_compression' from source: unknown 8421 1726773029.21239: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8421 1726773029.21270: variable 'ansible_facts' from source: unknown 8421 1726773029.21336: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576/AnsiballZ_stat.py 8421 1726773029.21438: Sending initial data 8421 1726773029.21445: Sent initial data (150 bytes) 8421 1726773029.24109: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpnp06x_8h /root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576/AnsiballZ_stat.py <<< 8421 1726773029.25532: stderr chunk (state=3): >>><<< 8421 1726773029.25541: stdout chunk (state=3): >>><<< 8421 1726773029.25563: done transferring module to remote 8421 1726773029.25574: _low_level_execute_command(): starting 8421 1726773029.25578: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576/ /root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576/AnsiballZ_stat.py && sleep 0' 8421 
1726773029.28004: stderr chunk (state=2): >>><<< 8421 1726773029.28014: stdout chunk (state=2): >>><<< 8421 1726773029.28030: _low_level_execute_command() done: rc=0, stdout=, stderr= 8421 1726773029.28034: _low_level_execute_command(): starting 8421 1726773029.28040: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576/AnsiballZ_stat.py && sleep 0' 8421 1726773029.42935: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8421 1726773029.43913: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8421 1726773029.43960: stderr chunk (state=3): >>><<< 8421 1726773029.43970: stdout chunk (state=3): >>><<< 8421 1726773029.43987: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/profiles", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.43.7 closed. 8421 1726773029.44009: done with _execute_module (stat, {'path': '/etc/tuned/profiles', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8421 1726773029.44020: _low_level_execute_command(): starting 8421 1726773029.44025: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773029.1848083-8421-24713679460576/ > /dev/null 2>&1 && sleep 0' 8421 1726773029.46547: stderr chunk (state=2): >>><<< 8421 1726773029.46557: stdout chunk (state=2): >>><<< 8421 1726773029.46574: _low_level_execute_command() done: rc=0, stdout=, stderr= 8421 1726773029.46582: handler run complete 8421 1726773029.46598: attempt loop complete, returning result 8421 1726773029.46617: variable 'item' from source: unknown 8421 1726773029.46683: variable 'item' from source: unknown ok: [managed_node1] => (item=/etc/tuned/profiles) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned/profiles", "stat": { "exists": false } } 8421 1726773029.46775: variable 'ansible_host' from source: host vars for 'managed_node1' 8421 1726773029.46787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8421 1726773029.46797: variable 'omit' from source: magic vars 8421 1726773029.46910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8421 1726773029.46934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8421 1726773029.46953: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8421 1726773029.46982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8421 1726773029.46995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8421 1726773029.47055: variable 'item' from source: unknown 8421 1726773029.47065: Evaluated conditional (item | length > 0): True 8421 1726773029.47071: variable 'omit' from source: magic vars 8421 1726773029.47082: variable 'omit' from source: magic vars 8421 1726773029.47113: variable 'item' from source: unknown 8421 1726773029.47159: variable 'item' from source: unknown 8421 1726773029.47174: variable 'omit' from source: magic vars 8421 1726773029.47192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8421 1726773029.47201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8421 1726773029.47207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8421 1726773029.47219: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8421 1726773029.47223: variable 'ansible_host' from source: host vars for 'managed_node1' 8421 1726773029.47227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8421 1726773029.47280: Set connection var ansible_shell_executable to /bin/sh 8421 1726773029.47286: Set connection var ansible_connection to ssh 8421 1726773029.47293: Set connection var ansible_module_compression to ZIP_DEFLATED 8421 1726773029.47300: Set connection var ansible_timeout to 10 8421 1726773029.47302: Set connection var ansible_shell_type to sh 8421 1726773029.47309: Set connection var ansible_pipelining to False 8421 1726773029.47324: variable 'ansible_shell_executable' from source: unknown 8421 1726773029.47327: variable 'ansible_connection' from source: unknown 8421 1726773029.47331: variable 'ansible_module_compression' from source: unknown 8421 1726773029.47334: variable 'ansible_shell_type' from source: unknown 8421 1726773029.47337: variable 'ansible_shell_executable' from source: unknown 8421 1726773029.47340: variable 'ansible_host' from source: host vars for 'managed_node1' 8421 1726773029.47345: variable 'ansible_pipelining' from source: unknown 8421 1726773029.47348: variable 'ansible_timeout' from source: unknown 8421 1726773029.47352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8421 1726773029.47426: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8421 1726773029.47437: variable 'omit' from source: magic vars 8421 1726773029.47442: starting attempt loop 8421 1726773029.47446: running the handler 8421 1726773029.47453: _low_level_execute_command(): starting 8421 
1726773029.47457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8421 1726773029.49703: stdout chunk (state=2): >>>/root <<< 8421 1726773029.49824: stderr chunk (state=3): >>><<< 8421 1726773029.49831: stdout chunk (state=3): >>><<< 8421 1726773029.49846: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8421 1726773029.49856: _low_level_execute_command(): starting 8421 1726773029.49861: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081 `" && echo ansible-tmp-1726773029.4985294-8421-20830300267081="` echo /root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081 `" ) && sleep 0' 8421 1726773029.52372: stdout chunk (state=2): >>>ansible-tmp-1726773029.4985294-8421-20830300267081=/root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081 <<< 8421 1726773029.52494: stderr chunk (state=3): >>><<< 8421 1726773029.52502: stdout chunk (state=3): >>><<< 8421 1726773029.52518: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773029.4985294-8421-20830300267081=/root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081 , stderr= 8421 1726773029.52548: variable 'ansible_module_compression' from source: unknown 8421 1726773029.52588: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8421 1726773029.52606: variable 'ansible_facts' from source: unknown 8421 1726773029.52662: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081/AnsiballZ_stat.py 8421 1726773029.52754: Sending initial data 8421 1726773029.52762: Sent initial data (150 bytes) 8421 1726773029.55366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp45723024 /root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081/AnsiballZ_stat.py <<< 8421 1726773029.56794: stderr chunk (state=3): >>><<< 8421 1726773029.56803: stdout chunk (state=3): >>><<< 8421 1726773029.56822: done transferring module to remote 8421 1726773029.56832: _low_level_execute_command(): starting 8421 1726773029.56837: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081/ /root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081/AnsiballZ_stat.py && sleep 0' 8421 1726773029.59249: stderr chunk (state=2): >>><<< 8421 1726773029.59258: stdout chunk (state=2): >>><<< 8421 1726773029.59272: _low_level_execute_command() done: rc=0, stdout=, stderr= 8421 1726773029.59277: _low_level_execute_command(): starting 8421 1726773029.59282: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081/AnsiballZ_stat.py && sleep 0' 8421 1726773029.75095: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726772770.5709214, "mtime": 1716968741.377, "ctime": 1716968741.377, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, 
"device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 8421 1726773029.76263: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8421 1726773029.76281: stderr chunk (state=3): >>><<< 8421 1726773029.76287: stdout chunk (state=3): >>><<< 8421 1726773029.76297: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned", "mode": "0755", "isdir": true, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 136, "inode": 917919, "dev": 51713, "nlink": 3, "atime": 1726772770.5709214, "mtime": 1716968741.377, "ctime": 1716968741.377, "wusr": true, "rusr": true, "xusr": true, "wgrp": false, "rgrp": true, "xgrp": true, "woth": false, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "pw_name": "root", "gr_name": "root", "mimetype": "inode/directory", "charset": "binary", "version": "1785990601", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} , stderr=Shared connection to 10.31.43.7 closed. 8421 1726773029.76354: done with _execute_module (stat, {'path': '/etc/tuned', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8421 1726773029.76366: _low_level_execute_command(): starting 8421 1726773029.76372: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773029.4985294-8421-20830300267081/ > /dev/null 2>&1 && sleep 0' 8421 1726773029.78895: stderr chunk (state=2): >>><<< 8421 1726773029.78905: stdout chunk (state=2): >>><<< 8421 1726773029.78921: _low_level_execute_command() done: rc=0, stdout=, stderr= 8421 1726773029.78928: handler run complete 8421 1726773029.78962: attempt loop complete, returning result 8421 1726773029.78978: variable 'item' from source: unknown 8421 1726773029.79039: variable 'item' from source: unknown ok: [managed_node1] => (item=/etc/tuned) => { "ansible_loop_var": "item", "changed": false, "item": "/etc/tuned", "stat": { "atime": 1726772770.5709214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1716968741.377, "dev": 51713, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 917919, "isblk": false, "ischr": false, "isdir": true, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/directory", 
"mode": "0755", "mtime": 1716968741.377, "nlink": 3, "path": "/etc/tuned", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 136, "uid": 0, "version": "1785990601", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 8421 1726773029.79082: dumping result to json 8421 1726773029.79094: done dumping result, returning 8421 1726773029.79102: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory [0affffe7-6841-f581-0619-000000000146] 8421 1726773029.79108: sending task result for task 0affffe7-6841-f581-0619-000000000146 8421 1726773029.79146: done sending task result for task 0affffe7-6841-f581-0619-000000000146 8421 1726773029.79151: WORKER PROCESS EXITING 8208 1726773029.79402: no more pending results, returning what we have 8208 1726773029.79405: results queue empty 8208 1726773029.79406: checking for any_errors_fatal 8208 1726773029.79410: done checking for any_errors_fatal 8208 1726773029.79410: checking for max_fail_percentage 8208 1726773029.79411: done checking for max_fail_percentage 8208 1726773029.79412: checking to see if all hosts have failed and the running result is not ok 8208 1726773029.79412: done checking to see if all hosts have failed 8208 1726773029.79413: getting the remaining hosts for this loop 8208 1726773029.79414: done getting the remaining hosts for this loop 8208 1726773029.79416: getting the next task for host managed_node1 8208 1726773029.79420: done getting next task for host managed_node1 8208 1726773029.79422: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8208 1726773029.79425: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773029.79431: getting variables 8208 1726773029.79432: in VariableManager get_vars() 8208 1726773029.79451: Calling all_inventory to load vars for managed_node1 8208 1726773029.79453: Calling groups_inventory to load vars for managed_node1 8208 1726773029.79454: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773029.79460: Calling all_plugins_play to load vars for managed_node1 8208 1726773029.79462: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773029.79463: Calling groups_plugins_play to load vars for managed_node1 8208 1726773029.79563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773029.79681: done with get_vars() 8208 1726773029.79691: done getting variables 8208 1726773029.79732: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:63 Thursday 19 September 2024 15:10:29 -0400 (0:00:00.680) 0:00:10.471 **** 8208 1726773029.79753: entering _queue_task() for managed_node1/set_fact 8208 1726773029.79922: worker is 1 (out of 1 available) 8208 1726773029.79937: exiting _queue_task() for managed_node1/set_fact 8208 1726773029.79948: done queuing things up, now waiting for results queue to drain 8208 1726773029.79950: waiting for pending results... 8443 1726773029.80064: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir 8443 1726773029.80172: in run() - task 0affffe7-6841-f581-0619-000000000147 8443 1726773029.80190: variable 'ansible_search_path' from source: unknown 8443 1726773029.80194: variable 'ansible_search_path' from source: unknown 8443 1726773029.80224: calling self._execute() 8443 1726773029.80279: variable 'ansible_host' from source: host vars for 'managed_node1' 8443 1726773029.80290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8443 1726773029.80298: variable 'omit' from source: magic vars 8443 1726773029.80390: variable 'omit' from source: magic vars 8443 1726773029.80432: variable 'omit' from source: magic vars 8443 1726773029.80769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8443 1726773029.82500: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8443 1726773029.82549: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8443 1726773029.82579: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8443 1726773029.82615: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8443 1726773029.82635: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8443 1726773029.82694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 8443 1726773029.82715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8443 1726773029.82733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8443 1726773029.82762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8443 1726773029.82774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8443 1726773029.82805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8443 1726773029.82819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8443 1726773029.82832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8443 1726773029.82853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8443 1726773029.82860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8443 1726773029.82907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8443 1726773029.82924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8443 1726773029.82940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8443 1726773029.82965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8443 1726773029.82978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8443 1726773029.83134: variable '__kernel_settings_find_profile_dirs' from source: set_fact 8443 1726773029.83204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8443 1726773029.83311: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8443 1726773029.83337: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8443 1726773029.83359: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8443 1726773029.83381: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8443 1726773029.83414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 8443 1726773029.83429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 8443 1726773029.83444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 8443 1726773029.83458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 8443 1726773029.83495: variable 'omit' from source: magic vars 8443 1726773029.83515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8443 1726773029.83536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8443 1726773029.83549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8443 1726773029.83560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8443 1726773029.83567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8443 1726773029.83590: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8443 1726773029.83594: variable 'ansible_host' from source: host vars for 'managed_node1' 8443 1726773029.83597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8443 1726773029.83661: Set connection var ansible_shell_executable to /bin/sh 8443 1726773029.83666: Set connection var ansible_connection to ssh 8443 1726773029.83671: Set connection var ansible_module_compression to ZIP_DEFLATED 8443 1726773029.83676: Set connection var ansible_timeout to 10 8443 1726773029.83677: Set connection var ansible_shell_type to sh 8443 1726773029.83682: Set connection var ansible_pipelining to False 8443 1726773029.83705: variable 'ansible_shell_executable' from source: unknown 8443 1726773029.83710: variable 'ansible_connection' from source: unknown 8443 1726773029.83714: variable 'ansible_module_compression' from source: unknown 8443 1726773029.83717: variable 'ansible_shell_type' from source: unknown 8443 1726773029.83720: variable 'ansible_shell_executable' from source: unknown 8443 1726773029.83723: variable 'ansible_host' from source: host vars for 'managed_node1' 8443 1726773029.83726: variable 'ansible_pipelining' from source: unknown 8443 1726773029.83729: variable 'ansible_timeout' from source: unknown 8443 1726773029.83733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8443 1726773029.83797: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8443 1726773029.83808: variable 'omit' from source: magic vars 8443 1726773029.83814: starting attempt loop 8443 1726773029.83817: running the handler 8443 1726773029.83827: handler run complete 8443 1726773029.83834: attempt loop complete, returning result 8443 1726773029.83837: _execute() done 8443 1726773029.83840: dumping result to json 8443 1726773029.83844: done dumping result, returning 8443 1726773029.83850: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set tuned profile parent dir [0affffe7-6841-f581-0619-000000000147] 8443 1726773029.83856: sending task result for task 0affffe7-6841-f581-0619-000000000147 8443 1726773029.83874: done sending task result for task 0affffe7-6841-f581-0619-000000000147 8443 1726773029.83876: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_profile_parent": "/etc/tuned" }, "changed": false } 8208 1726773029.84014: no more pending results, returning what we have 8208 1726773029.84017: results queue empty 8208 1726773029.84018: checking for any_errors_fatal 8208 1726773029.84026: done checking for any_errors_fatal 8208 1726773029.84026: checking for max_fail_percentage 8208 1726773029.84028: done checking for max_fail_percentage 8208 1726773029.84028: checking to see if all hosts have failed and the running result is not ok 8208 1726773029.84029: done checking to see if all hosts have failed 8208 1726773029.84030: getting the remaining hosts for this loop 8208 1726773029.84031: done getting the remaining hosts for this loop 8208 1726773029.84033: getting the next task for host managed_node1 8208 1726773029.84039: done getting next task for host managed_node1 8208 1726773029.84042: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8208 1726773029.84045: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773029.84054: getting variables 8208 1726773029.84055: in VariableManager get_vars() 8208 1726773029.84091: Calling all_inventory to load vars for managed_node1 8208 1726773029.84094: Calling groups_inventory to load vars for managed_node1 8208 1726773029.84095: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773029.84103: Calling all_plugins_play to load vars for managed_node1 8208 1726773029.84106: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773029.84108: Calling groups_plugins_play to load vars for managed_node1 8208 1726773029.84216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773029.84361: done with get_vars() 8208 1726773029.84372: done getting variables 8208 1726773029.84443: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67 Thursday 19 September 2024 15:10:29 -0400 (0:00:00.047) 0:00:10.518 **** 8208 1726773029.84468: entering _queue_task() for managed_node1/service 8208 1726773029.84470: Creating lock for service 8208 1726773029.84644: worker is 1 (out of 1 available) 8208 1726773029.84660: exiting _queue_task() for managed_node1/service 8208 1726773029.84673: done queuing things up, now waiting for results queue to drain 8208 1726773029.84675: waiting for pending results... 
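The task banner above (tasks/main.yml:67) dispatches the service action once per entry in __kernel_settings_services, which was loaded earlier via include_vars; the only item seen in this log is "tuned", and on this host the service action plugin delegates to ansible.legacy.systemd with the arguments name=tuned, state=started, enabled=true that appear further down. A minimal sketch consistent with those logged arguments, not the role's actual source:

- name: Ensure required services are enabled and started
  service:
    name: "{{ item }}"                        # "tuned" in this run
    state: started
    enabled: true
  loop: "{{ __kernel_settings_services }}"    # loaded earlier via include_vars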
8445 1726773029.84788: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started 8445 1726773029.84901: in run() - task 0affffe7-6841-f581-0619-000000000148 8445 1726773029.84916: variable 'ansible_search_path' from source: unknown 8445 1726773029.84920: variable 'ansible_search_path' from source: unknown 8445 1726773029.84952: variable '__kernel_settings_services' from source: include_vars 8445 1726773029.85177: variable '__kernel_settings_services' from source: include_vars 8445 1726773029.85226: variable 'omit' from source: magic vars 8445 1726773029.85306: variable 'ansible_host' from source: host vars for 'managed_node1' 8445 1726773029.85317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8445 1726773029.85327: variable 'omit' from source: magic vars 8445 1726773029.85386: variable 'omit' from source: magic vars 8445 1726773029.85421: variable 'omit' from source: magic vars 8445 1726773029.85457: variable 'item' from source: unknown 8445 1726773029.85515: variable 'item' from source: unknown 8445 1726773029.85532: variable 'omit' from source: magic vars 8445 1726773029.85565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8445 1726773029.85593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8445 1726773029.85610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8445 1726773029.85623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8445 1726773029.85633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8445 1726773029.85657: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8445 1726773029.85661: variable 'ansible_host' from source: host vars for 'managed_node1' 8445 1726773029.85665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8445 1726773029.85731: Set connection var ansible_shell_executable to /bin/sh 8445 1726773029.85737: Set connection var ansible_connection to ssh 8445 1726773029.85743: Set connection var ansible_module_compression to ZIP_DEFLATED 8445 1726773029.85750: Set connection var ansible_timeout to 10 8445 1726773029.85753: Set connection var ansible_shell_type to sh 8445 1726773029.85761: Set connection var ansible_pipelining to False 8445 1726773029.85778: variable 'ansible_shell_executable' from source: unknown 8445 1726773029.85781: variable 'ansible_connection' from source: unknown 8445 1726773029.85792: variable 'ansible_module_compression' from source: unknown 8445 1726773029.85797: variable 'ansible_shell_type' from source: unknown 8445 1726773029.85801: variable 'ansible_shell_executable' from source: unknown 8445 1726773029.85805: variable 'ansible_host' from source: host vars for 'managed_node1' 8445 1726773029.85809: variable 'ansible_pipelining' from source: unknown 8445 1726773029.85812: variable 'ansible_timeout' from source: unknown 8445 1726773029.85817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8445 1726773029.85908: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8445 1726773029.85919: variable 'omit' from source: magic vars 8445 1726773029.85924: starting attempt loop 8445 1726773029.85928: running the handler 8445 1726773029.85991: variable 'ansible_facts' from source: unknown 8445 1726773029.86078: _low_level_execute_command(): starting 8445 1726773029.86102: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8445 1726773029.88566: stdout chunk (state=2): >>>/root <<< 8445 1726773029.88684: stderr chunk (state=3): >>><<< 8445 1726773029.88692: stdout chunk (state=3): >>><<< 8445 1726773029.88713: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8445 1726773029.88726: _low_level_execute_command(): starting 8445 1726773029.88732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570 `" && echo ansible-tmp-1726773029.8872108-8445-22482752781570="` echo /root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570 `" ) && sleep 0' 8445 1726773029.91270: stdout chunk (state=2): >>>ansible-tmp-1726773029.8872108-8445-22482752781570=/root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570 <<< 8445 1726773029.91390: stderr chunk (state=3): >>><<< 8445 1726773029.91398: stdout chunk (state=3): >>><<< 8445 1726773029.91414: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773029.8872108-8445-22482752781570=/root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570 , stderr= 8445 1726773029.91441: variable 'ansible_module_compression' from source: unknown 8445 1726773029.91494: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 8445 1726773029.91499: ANSIBALLZ: Acquiring lock 8445 1726773029.91503: ANSIBALLZ: Lock acquired: 139627423671568 8445 1726773029.91508: ANSIBALLZ: Creating module 8445 1726773030.17977: ANSIBALLZ: Writing module into payload 8445 1726773030.18207: ANSIBALLZ: Writing module 8445 1726773030.18240: ANSIBALLZ: Renaming module 8445 1726773030.18249: ANSIBALLZ: Done creating module 8445 1726773030.18273: variable 'ansible_facts' from source: unknown 8445 1726773030.18451: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570/AnsiballZ_systemd.py 8445 1726773030.18962: Sending initial data 8445 1726773030.18972: Sent initial data (153 bytes) 8445 1726773030.22077: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp6545u3wq /root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570/AnsiballZ_systemd.py <<< 8445 1726773030.25282: stderr chunk (state=3): >>><<< 8445 1726773030.25294: stdout chunk (state=3): >>><<< 8445 1726773030.25317: done transferring module to remote 8445 1726773030.25330: _low_level_execute_command(): starting 8445 1726773030.25337: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570/ /root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570/AnsiballZ_systemd.py && sleep 0' 8445 1726773030.28039: stderr chunk (state=2): >>><<< 8445 1726773030.28053: stdout chunk (state=2): >>><<< 8445 1726773030.28069: _low_level_execute_command() done: rc=0, stdout=, stderr= 8445 1726773030.28074: _low_level_execute_command(): starting 8445 
1726773030.28079: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570/AnsiballZ_systemd.py && sleep 0' 8445 1726773030.56195: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:06:10 EDT", "WatchdogTimestampMonotonic": "24900732", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "664", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ExecMainStartTimestampMonotonic": "23763308", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:06:09 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18628608", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh":<<< 8445 1726773030.56545: stdout chunk (state=3): >>> "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", 
"CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service shutdown.target cpupower.service auto-cpufreq.service tlp.service", "Before": "shutdown.target multi-user.target", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:06:10 EDT", "StateChangeTimestampMonotonic": "24900735", "InactiveExitTimestamp": "Thu 2024-09-19 15:06:09 EDT", "InactiveExitTimestampMonotonic": "23763483", "ActiveEnterTimestamp": "Thu 2024-09-19 15:06:10 EDT", "ActiveEnterTimestampMonotonic": "24900735", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": 
"infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ConditionTimestampMonotonic": "23762466", "AssertTimestamp": "Thu 2024-09-19 15:06:09 EDT", "AssertTimestampMonotonic": "23762468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "883d3d1b58be437785da31f48ec3b86d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8445 1726773030.57796: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8445 1726773030.57839: stderr chunk (state=3): >>><<< 8445 1726773030.57846: stdout chunk (state=3): >>><<< 8445 1726773030.57871: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:06:10 EDT", "WatchdogTimestampMonotonic": "24900732", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "664", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ExecMainStartTimestampMonotonic": "23763308", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:06:09 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18628608", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", 
"LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service shutdown.target cpupower.service auto-cpufreq.service tlp.service", "Before": "shutdown.target multi-user.target", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", 
"UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:06:10 EDT", "StateChangeTimestampMonotonic": "24900735", "InactiveExitTimestamp": "Thu 2024-09-19 15:06:09 EDT", "InactiveExitTimestampMonotonic": "23763483", "ActiveEnterTimestamp": "Thu 2024-09-19 15:06:10 EDT", "ActiveEnterTimestampMonotonic": "24900735", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ConditionTimestampMonotonic": "23762466", "AssertTimestamp": "Thu 2024-09-19 15:06:09 EDT", "AssertTimestampMonotonic": "23762468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "883d3d1b58be437785da31f48ec3b86d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.43.7 closed. 8445 1726773030.57976: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8445 1726773030.57994: _low_level_execute_command(): starting 8445 1726773030.58002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773029.8872108-8445-22482752781570/ > /dev/null 2>&1 && sleep 0' 8445 1726773030.60528: stderr chunk (state=2): >>><<< 8445 1726773030.60538: stdout chunk (state=2): >>><<< 8445 1726773030.60552: _low_level_execute_command() done: rc=0, stdout=, stderr= 8445 1726773030.60561: handler run complete 8445 1726773030.60599: attempt loop complete, returning result 8445 1726773030.60620: variable 'item' from source: unknown 8445 1726773030.60683: variable 'item' from source: unknown ok: [managed_node1] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:06:10 EDT", "ActiveEnterTimestampMonotonic": "24900735", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "AllowIsolate": "no", "AllowedCPUs": "", 
"AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:06:09 EDT", "AssertTimestampMonotonic": "23762468", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ConditionTimestampMonotonic": "23762466", "ConfigurationDirectoryMode": "0755", "Conflicts": "power-profiles-daemon.service shutdown.target cpupower.service auto-cpufreq.service tlp.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainStartTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ExecMainStartTimestampMonotonic": "23763308", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:06:09 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:06:09 EDT", "InactiveExitTimestampMonotonic": "23763483", "InvocationID": "883d3d1b58be437785da31f48ec3b86d", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "664", "MemoryAccounting": "yes", "MemoryCurrent": "18628608", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:06:10 EDT", "StateChangeTimestampMonotonic": "24900735", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:06:10 EDT", "WatchdogTimestampMonotonic": "24900732", "WatchdogUSec": "0" } } 8445 1726773030.60795: dumping result to json 8445 1726773030.60813: done dumping result, returning 8445 1726773030.60821: done running TaskExecutor() 
for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started [0affffe7-6841-f581-0619-000000000148] 8445 1726773030.60828: sending task result for task 0affffe7-6841-f581-0619-000000000148 8445 1726773030.60932: done sending task result for task 0affffe7-6841-f581-0619-000000000148 8445 1726773030.60937: WORKER PROCESS EXITING 8208 1726773030.61271: no more pending results, returning what we have 8208 1726773030.61274: results queue empty 8208 1726773030.61274: checking for any_errors_fatal 8208 1726773030.61277: done checking for any_errors_fatal 8208 1726773030.61277: checking for max_fail_percentage 8208 1726773030.61278: done checking for max_fail_percentage 8208 1726773030.61279: checking to see if all hosts have failed and the running result is not ok 8208 1726773030.61279: done checking to see if all hosts have failed 8208 1726773030.61280: getting the remaining hosts for this loop 8208 1726773030.61280: done getting the remaining hosts for this loop 8208 1726773030.61283: getting the next task for host managed_node1 8208 1726773030.61290: done getting next task for host managed_node1 8208 1726773030.61293: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8208 1726773030.61295: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773030.61302: getting variables 8208 1726773030.61303: in VariableManager get_vars() 8208 1726773030.61324: Calling all_inventory to load vars for managed_node1 8208 1726773030.61326: Calling groups_inventory to load vars for managed_node1 8208 1726773030.61327: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773030.61333: Calling all_plugins_play to load vars for managed_node1 8208 1726773030.61335: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773030.61336: Calling groups_plugins_play to load vars for managed_node1 8208 1726773030.61445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773030.61588: done with get_vars() 8208 1726773030.61597: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74 Thursday 19 September 2024 15:10:30 -0400 (0:00:00.772) 0:00:11.290 **** 8208 1726773030.61676: entering _queue_task() for managed_node1/file 8208 1726773030.61882: worker is 1 (out of 1 available) 8208 1726773030.61899: exiting _queue_task() for managed_node1/file 8208 1726773030.61911: done queuing things up, now waiting for results queue to drain 8208 1726773030.61916: waiting for pending results... 8475 1726773030.62140: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists 8475 1726773030.62281: in run() - task 0affffe7-6841-f581-0619-000000000149 8475 1726773030.62301: variable 'ansible_search_path' from source: unknown 8475 1726773030.62305: variable 'ansible_search_path' from source: unknown 8475 1726773030.62337: calling self._execute() 8475 1726773030.62420: variable 'ansible_host' from source: host vars for 'managed_node1' 8475 1726773030.62428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8475 1726773030.62436: variable 'omit' from source: magic vars 8475 1726773030.62533: variable 'omit' from source: magic vars 8475 1726773030.62583: variable 'omit' from source: magic vars 8475 1726773030.62614: variable '__kernel_settings_profile_dir' from source: role '' all vars 8475 1726773030.62867: variable '__kernel_settings_profile_dir' from source: role '' all vars 8475 1726773030.62944: variable '__kernel_settings_profile_parent' from source: set_fact 8475 1726773030.62951: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8475 1726773030.62977: variable 'omit' from source: magic vars 8475 1726773030.63078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8475 1726773030.63112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8475 1726773030.63128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8475 1726773030.63142: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8475 1726773030.63153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8475 1726773030.63174: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8475 1726773030.63179: variable 'ansible_host' from source: host vars for 'managed_node1' 8475 
1726773030.63181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8475 1726773030.63253: Set connection var ansible_shell_executable to /bin/sh 8475 1726773030.63259: Set connection var ansible_connection to ssh 8475 1726773030.63265: Set connection var ansible_module_compression to ZIP_DEFLATED 8475 1726773030.63274: Set connection var ansible_timeout to 10 8475 1726773030.63277: Set connection var ansible_shell_type to sh 8475 1726773030.63283: Set connection var ansible_pipelining to False 8475 1726773030.63302: variable 'ansible_shell_executable' from source: unknown 8475 1726773030.63305: variable 'ansible_connection' from source: unknown 8475 1726773030.63310: variable 'ansible_module_compression' from source: unknown 8475 1726773030.63314: variable 'ansible_shell_type' from source: unknown 8475 1726773030.63317: variable 'ansible_shell_executable' from source: unknown 8475 1726773030.63321: variable 'ansible_host' from source: host vars for 'managed_node1' 8475 1726773030.63326: variable 'ansible_pipelining' from source: unknown 8475 1726773030.63329: variable 'ansible_timeout' from source: unknown 8475 1726773030.63333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8475 1726773030.63504: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8475 1726773030.63516: variable 'omit' from source: magic vars 8475 1726773030.63521: starting attempt loop 8475 1726773030.63525: running the handler 8475 1726773030.63538: _low_level_execute_command(): starting 8475 1726773030.63546: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8475 1726773030.66724: stdout chunk (state=2): >>>/root <<< 8475 1726773030.66845: stderr chunk (state=3): >>><<< 8475 1726773030.66861: stdout chunk (state=3): >>><<< 8475 1726773030.66887: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8475 1726773030.66901: _low_level_execute_command(): starting 8475 1726773030.66908: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381 `" && echo ansible-tmp-1726773030.6689622-8475-167360567704381="` echo /root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381 `" ) && sleep 0' 8475 1726773030.69495: stdout chunk (state=2): >>>ansible-tmp-1726773030.6689622-8475-167360567704381=/root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381 <<< 8475 1726773030.69619: stderr chunk (state=3): >>><<< 8475 1726773030.69626: stdout chunk (state=3): >>><<< 8475 1726773030.69644: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773030.6689622-8475-167360567704381=/root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381 , stderr= 8475 1726773030.69690: variable 'ansible_module_compression' from source: unknown 8475 1726773030.69734: ANSIBALLZ: Using lock for file 8475 1726773030.69739: ANSIBALLZ: Acquiring lock 8475 1726773030.69743: ANSIBALLZ: Lock acquired: 139627422454256 8475 1726773030.69747: ANSIBALLZ: Creating module 8475 1726773030.79455: ANSIBALLZ: Writing module into payload 8475 1726773030.79610: ANSIBALLZ: Writing module 8475 1726773030.79631: ANSIBALLZ: Renaming module 8475 1726773030.79638: ANSIBALLZ: Done creating module 8475 
1726773030.79653: variable 'ansible_facts' from source: unknown 8475 1726773030.79711: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381/AnsiballZ_file.py 8475 1726773030.79811: Sending initial data 8475 1726773030.79818: Sent initial data (151 bytes) 8475 1726773030.82796: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmps1__qol2 /root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381/AnsiballZ_file.py <<< 8475 1726773030.84082: stderr chunk (state=3): >>><<< 8475 1726773030.84093: stdout chunk (state=3): >>><<< 8475 1726773030.84114: done transferring module to remote 8475 1726773030.84126: _low_level_execute_command(): starting 8475 1726773030.84132: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381/ /root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381/AnsiballZ_file.py && sleep 0' 8475 1726773030.86597: stderr chunk (state=2): >>><<< 8475 1726773030.86607: stdout chunk (state=2): >>><<< 8475 1726773030.86622: _low_level_execute_command() done: rc=0, stdout=, stderr= 8475 1726773030.86626: _low_level_execute_command(): starting 8475 1726773030.86633: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381/AnsiballZ_file.py && sleep 0' 8475 1726773031.02809: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8475 1726773031.03877: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8475 1726773031.03891: stdout chunk (state=3): >>><<< 8475 1726773031.03903: stderr chunk (state=3): >>><<< 8475 1726773031.03918: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "absent"}, "after": {"path": "/etc/tuned/kernel_settings", "state": "directory"}}, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0755", "state": "directory", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "directory", "mode": "0755", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
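The file module invocation above (task path tasks/main.yml:74) created /etc/tuned/kernel_settings; the earlier variable resolution suggests __kernel_settings_profile_dir is templated from __kernel_settings_profile_parent and the tuned profile name. A minimal sketch matching the logged module arguments, not the role's actual source:

- name: Ensure kernel settings profile directory exists
  file:
    path: "{{ __kernel_settings_profile_dir }}"   # /etc/tuned/kernel_settings in this run
    state: directory
    mode: "0755"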
8475 1726773031.03962: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'directory', 'mode': '0755', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8475 1726773031.03978: _low_level_execute_command(): starting 8475 1726773031.03985: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773030.6689622-8475-167360567704381/ > /dev/null 2>&1 && sleep 0' 8475 1726773031.07428: stderr chunk (state=2): >>><<< 8475 1726773031.07441: stdout chunk (state=2): >>><<< 8475 1726773031.07459: _low_level_execute_command() done: rc=0, stdout=, stderr= 8475 1726773031.07467: handler run complete 8475 1726773031.07496: attempt loop complete, returning result 8475 1726773031.07502: _execute() done 8475 1726773031.07506: dumping result to json 8475 1726773031.07511: done dumping result, returning 8475 1726773031.07520: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists [0affffe7-6841-f581-0619-000000000149] 8475 1726773031.07526: sending task result for task 0affffe7-6841-f581-0619-000000000149 8475 1726773031.07571: done sending task result for task 0affffe7-6841-f581-0619-000000000149 8475 1726773031.07575: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/tuned/kernel_settings", "secontext": "unconfined_u:object_r:tuned_etc_t:s0", "size": 6, "state": "directory", "uid": 0 } 8208 1726773031.08081: no more pending results, returning what we have 8208 1726773031.08084: results queue empty 8208 1726773031.08089: checking for any_errors_fatal 8208 1726773031.08103: done checking for any_errors_fatal 8208 1726773031.08104: checking for max_fail_percentage 8208 1726773031.08106: done checking for max_fail_percentage 8208 1726773031.08107: checking to see if all hosts have failed and the running result is not ok 8208 1726773031.08108: done checking to see if all hosts have failed 8208 1726773031.08108: getting the remaining hosts for this loop 8208 1726773031.08110: done getting the remaining hosts for this loop 8208 1726773031.08113: getting the next task for host managed_node1 8208 1726773031.08119: done getting next task for host managed_node1 8208 1726773031.08122: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8208 1726773031.08126: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773031.08136: getting variables 8208 1726773031.08138: in VariableManager get_vars() 8208 1726773031.08172: Calling all_inventory to load vars for managed_node1 8208 1726773031.08175: Calling groups_inventory to load vars for managed_node1 8208 1726773031.08177: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773031.08187: Calling all_plugins_play to load vars for managed_node1 8208 1726773031.08190: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773031.08193: Calling groups_plugins_play to load vars for managed_node1 8208 1726773031.08356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773031.08557: done with get_vars() 8208 1726773031.08568: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80 Thursday 19 September 2024 15:10:31 -0400 (0:00:00.469) 0:00:11.760 **** 8208 1726773031.08664: entering _queue_task() for managed_node1/slurp 8208 1726773031.08917: worker is 1 (out of 1 available) 8208 1726773031.08933: exiting _queue_task() for managed_node1/slurp 8208 1726773031.08943: done queuing things up, now waiting for results queue to drain 8208 1726773031.08945: waiting for pending results... 
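The task queued here (tasks/main.yml:80) reads the current tuned profile with the slurp module; the arguments logged below show it reads __kernel_settings_tuned_active_profile, which resolves to /etc/tuned/active_profile. A minimal sketch based on those arguments; the register name is a placeholder, not taken from the role:

- name: Get active_profile
  slurp:
    path: "{{ __kernel_settings_tuned_active_profile }}"   # /etc/tuned/active_profile here
  register: __kernel_settings_active_profile_content       # hypothetical register name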
8497 1726773031.09161: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile 8497 1726773031.09314: in run() - task 0affffe7-6841-f581-0619-00000000014a 8497 1726773031.09333: variable 'ansible_search_path' from source: unknown 8497 1726773031.09343: variable 'ansible_search_path' from source: unknown 8497 1726773031.09375: calling self._execute() 8497 1726773031.09455: variable 'ansible_host' from source: host vars for 'managed_node1' 8497 1726773031.09464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8497 1726773031.09473: variable 'omit' from source: magic vars 8497 1726773031.09549: variable 'omit' from source: magic vars 8497 1726773031.09594: variable 'omit' from source: magic vars 8497 1726773031.09615: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8497 1726773031.09831: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8497 1726773031.09891: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8497 1726773031.09916: variable 'omit' from source: magic vars 8497 1726773031.09948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8497 1726773031.09973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8497 1726773031.10003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8497 1726773031.10020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8497 1726773031.10032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8497 1726773031.10055: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8497 1726773031.10059: variable 'ansible_host' from source: host vars for 'managed_node1' 8497 1726773031.10062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8497 1726773031.10128: Set connection var ansible_shell_executable to /bin/sh 8497 1726773031.10133: Set connection var ansible_connection to ssh 8497 1726773031.10137: Set connection var ansible_module_compression to ZIP_DEFLATED 8497 1726773031.10143: Set connection var ansible_timeout to 10 8497 1726773031.10145: Set connection var ansible_shell_type to sh 8497 1726773031.10150: Set connection var ansible_pipelining to False 8497 1726773031.10165: variable 'ansible_shell_executable' from source: unknown 8497 1726773031.10168: variable 'ansible_connection' from source: unknown 8497 1726773031.10170: variable 'ansible_module_compression' from source: unknown 8497 1726773031.10172: variable 'ansible_shell_type' from source: unknown 8497 1726773031.10173: variable 'ansible_shell_executable' from source: unknown 8497 1726773031.10175: variable 'ansible_host' from source: host vars for 'managed_node1' 8497 1726773031.10177: variable 'ansible_pipelining' from source: unknown 8497 1726773031.10179: variable 'ansible_timeout' from source: unknown 8497 1726773031.10181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8497 1726773031.10329: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
8497 1726773031.10341: variable 'omit' from source: magic vars 8497 1726773031.10347: starting attempt loop 8497 1726773031.10351: running the handler 8497 1726773031.10362: _low_level_execute_command(): starting 8497 1726773031.10369: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8497 1726773031.14196: stdout chunk (state=2): >>>/root <<< 8497 1726773031.14313: stderr chunk (state=3): >>><<< 8497 1726773031.14324: stdout chunk (state=3): >>><<< 8497 1726773031.14345: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8497 1726773031.14359: _low_level_execute_command(): starting 8497 1726773031.14366: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326 `" && echo ansible-tmp-1726773031.1435397-8497-247462383994326="` echo /root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326 `" ) && sleep 0' 8497 1726773031.16992: stdout chunk (state=2): >>>ansible-tmp-1726773031.1435397-8497-247462383994326=/root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326 <<< 8497 1726773031.17392: stderr chunk (state=3): >>><<< 8497 1726773031.17402: stdout chunk (state=3): >>><<< 8497 1726773031.17421: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773031.1435397-8497-247462383994326=/root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326 , stderr= 8497 1726773031.17471: variable 'ansible_module_compression' from source: unknown 8497 1726773031.17518: ANSIBALLZ: Using lock for slurp 8497 1726773031.17524: ANSIBALLZ: Acquiring lock 8497 1726773031.17528: ANSIBALLZ: Lock acquired: 139627422453296 8497 1726773031.17532: ANSIBALLZ: Creating module 8497 1726773031.30625: ANSIBALLZ: Writing module into payload 8497 1726773031.30678: ANSIBALLZ: Writing module 8497 1726773031.30699: ANSIBALLZ: Renaming module 8497 1726773031.30706: ANSIBALLZ: Done creating module 8497 1726773031.30723: variable 'ansible_facts' from source: unknown 8497 1726773031.30777: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326/AnsiballZ_slurp.py 8497 1726773031.30881: Sending initial data 8497 1726773031.30890: Sent initial data (152 bytes) 8497 1726773031.34693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpauybhz8n /root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326/AnsiballZ_slurp.py <<< 8497 1726773031.35949: stderr chunk (state=3): >>><<< 8497 1726773031.35960: stdout chunk (state=3): >>><<< 8497 1726773031.35987: done transferring module to remote 8497 1726773031.36000: _low_level_execute_command(): starting 8497 1726773031.36005: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326/ /root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326/AnsiballZ_slurp.py && sleep 0' 8497 1726773031.38793: stderr chunk (state=2): >>><<< 8497 1726773031.38805: stdout chunk (state=2): >>><<< 8497 1726773031.38821: _low_level_execute_command() done: rc=0, stdout=, stderr= 8497 1726773031.38826: _low_level_execute_command(): starting 8497 1726773031.38831: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326/AnsiballZ_slurp.py && sleep 0' 8497 1726773031.54403: stdout chunk (state=2): >>> {"content": 
"dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 8497 1726773031.54755: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8497 1726773031.54769: stdout chunk (state=3): >>><<< 8497 1726773031.54780: stderr chunk (state=3): >>><<< 8497 1726773031.54795: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdAo=", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.43.7 closed. 8497 1726773031.54820: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8497 1726773031.54833: _low_level_execute_command(): starting 8497 1726773031.54839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773031.1435397-8497-247462383994326/ > /dev/null 2>&1 && sleep 0' 8497 1726773031.58522: stderr chunk (state=2): >>><<< 8497 1726773031.58534: stdout chunk (state=2): >>><<< 8497 1726773031.58551: _low_level_execute_command() done: rc=0, stdout=, stderr= 8497 1726773031.58562: handler run complete 8497 1726773031.58583: attempt loop complete, returning result 8497 1726773031.58590: _execute() done 8497 1726773031.58594: dumping result to json 8497 1726773031.58598: done dumping result, returning 8497 1726773031.58606: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get active_profile [0affffe7-6841-f581-0619-00000000014a] 8497 1726773031.58613: sending task result for task 0affffe7-6841-f581-0619-00000000014a 8497 1726773031.58648: done sending task result for task 0affffe7-6841-f581-0619-00000000014a 8497 1726773031.58652: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "content": "dmlydHVhbC1ndWVzdAo=", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8208 1726773031.59136: no more pending results, returning what we have 8208 1726773031.59139: results queue empty 8208 1726773031.59140: checking for any_errors_fatal 8208 1726773031.59149: done checking for any_errors_fatal 8208 1726773031.59149: checking for max_fail_percentage 8208 1726773031.59150: done checking for max_fail_percentage 8208 1726773031.59151: checking to see if all hosts have failed and the running result is not ok 8208 1726773031.59152: done checking to see if all hosts have failed 8208 1726773031.59152: getting the remaining hosts for this loop 8208 1726773031.59154: done getting the remaining hosts for this loop 8208 1726773031.59157: getting the next task for host managed_node1 8208 1726773031.59163: done getting next task for host managed_node1 8208 1726773031.59169: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8208 1726773031.59172: ^ 
state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773031.59183: getting variables 8208 1726773031.59186: in VariableManager get_vars() 8208 1726773031.59220: Calling all_inventory to load vars for managed_node1 8208 1726773031.59223: Calling groups_inventory to load vars for managed_node1 8208 1726773031.59225: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773031.59233: Calling all_plugins_play to load vars for managed_node1 8208 1726773031.59236: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773031.59239: Calling groups_plugins_play to load vars for managed_node1 8208 1726773031.59410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773031.59623: done with get_vars() 8208 1726773031.59635: done getting variables 8208 1726773031.59693: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set active_profile] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:85 Thursday 19 September 2024 15:10:31 -0400 (0:00:00.510) 0:00:12.271 **** 8208 1726773031.59724: entering _queue_task() for managed_node1/set_fact 8208 1726773031.59957: worker is 1 (out of 1 available) 8208 1726773031.59974: exiting _queue_task() for managed_node1/set_fact 8208 1726773031.59989: done queuing things up, now waiting for results queue to drain 8208 1726773031.59991: waiting for pending results... 
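The Get active_profile task above returns the managed host's current tuned profile as a base64-encoded slurp payload ("dmlydHVhbC1ndWVzdAo="). A minimal Python sketch of how that payload decodes (variable names are illustrative, not part of the role):

```python
import base64

# "content" field reported by the slurp module in the task result above
encoded = "dmlydHVhbC1ndWVzdAo="
current_profile = base64.b64decode(encoded).decode("utf-8").strip()
print(current_profile)  # -> virtual-guest
```

The decoded value, virtual-guest, is what the following Set active_profile task combines with the role's own kernel_settings profile.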
8524 1726773031.61218: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile 8524 1726773031.61362: in run() - task 0affffe7-6841-f581-0619-00000000014b 8524 1726773031.61384: variable 'ansible_search_path' from source: unknown 8524 1726773031.61391: variable 'ansible_search_path' from source: unknown 8524 1726773031.61427: calling self._execute() 8524 1726773031.61604: variable 'ansible_host' from source: host vars for 'managed_node1' 8524 1726773031.61614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8524 1726773031.61624: variable 'omit' from source: magic vars 8524 1726773031.61727: variable 'omit' from source: magic vars 8524 1726773031.61772: variable 'omit' from source: magic vars 8524 1726773031.63236: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8524 1726773031.63248: variable '__cur_profile' from source: task vars 8524 1726773031.63398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8524 1726773031.66682: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8524 1726773031.66763: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8524 1726773031.66804: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8524 1726773031.66836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8524 1726773031.66860: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8524 1726773031.66936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8524 1726773031.66969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8524 1726773031.66996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8524 1726773031.67039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8524 1726773031.67054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8524 1726773031.67178: variable '__kernel_settings_tuned_current_profile' from source: set_fact 8524 1726773031.67224: variable 'omit' from source: magic vars 8524 1726773031.67250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8524 1726773031.67277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8524 1726773031.67297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8524 1726773031.67313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8524 
1726773031.67323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8524 1726773031.67350: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8524 1726773031.67356: variable 'ansible_host' from source: host vars for 'managed_node1' 8524 1726773031.67359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8524 1726773031.67453: Set connection var ansible_shell_executable to /bin/sh 8524 1726773031.67459: Set connection var ansible_connection to ssh 8524 1726773031.67467: Set connection var ansible_module_compression to ZIP_DEFLATED 8524 1726773031.67476: Set connection var ansible_timeout to 10 8524 1726773031.67479: Set connection var ansible_shell_type to sh 8524 1726773031.67581: Set connection var ansible_pipelining to False 8524 1726773031.67610: variable 'ansible_shell_executable' from source: unknown 8524 1726773031.67615: variable 'ansible_connection' from source: unknown 8524 1726773031.67618: variable 'ansible_module_compression' from source: unknown 8524 1726773031.67620: variable 'ansible_shell_type' from source: unknown 8524 1726773031.67623: variable 'ansible_shell_executable' from source: unknown 8524 1726773031.67625: variable 'ansible_host' from source: host vars for 'managed_node1' 8524 1726773031.67629: variable 'ansible_pipelining' from source: unknown 8524 1726773031.67631: variable 'ansible_timeout' from source: unknown 8524 1726773031.67635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8524 1726773031.67725: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8524 1726773031.67737: variable 'omit' from source: magic vars 8524 1726773031.67742: starting attempt loop 8524 1726773031.67745: running the handler 8524 1726773031.67755: handler run complete 8524 1726773031.67763: attempt loop complete, returning result 8524 1726773031.67769: _execute() done 8524 1726773031.67772: dumping result to json 8524 1726773031.67775: done dumping result, returning 8524 1726773031.67782: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set active_profile [0affffe7-6841-f581-0619-00000000014b] 8524 1726773031.67790: sending task result for task 0affffe7-6841-f581-0619-00000000014b 8524 1726773031.67812: done sending task result for task 0affffe7-6841-f581-0619-00000000014b 8524 1726773031.67815: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_active_profile": "virtual-guest kernel_settings" }, "changed": false } 8208 1726773031.68357: no more pending results, returning what we have 8208 1726773031.68360: results queue empty 8208 1726773031.68361: checking for any_errors_fatal 8208 1726773031.68367: done checking for any_errors_fatal 8208 1726773031.68368: checking for max_fail_percentage 8208 1726773031.68369: done checking for max_fail_percentage 8208 1726773031.68370: checking to see if all hosts have failed and the running result is not ok 8208 1726773031.68370: done checking to see if all hosts have failed 8208 1726773031.68371: getting the remaining hosts for this loop 8208 1726773031.68372: done getting the remaining hosts for this loop 8208 1726773031.68375: getting the next task 
for host managed_node1 8208 1726773031.68380: done getting next task for host managed_node1 8208 1726773031.68383: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8208 1726773031.68388: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773031.68402: getting variables 8208 1726773031.68404: in VariableManager get_vars() 8208 1726773031.68430: Calling all_inventory to load vars for managed_node1 8208 1726773031.68432: Calling groups_inventory to load vars for managed_node1 8208 1726773031.68434: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773031.68442: Calling all_plugins_play to load vars for managed_node1 8208 1726773031.68444: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773031.68446: Calling groups_plugins_play to load vars for managed_node1 8208 1726773031.68598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773031.68808: done with get_vars() 8208 1726773031.68818: done getting variables 8208 1726773031.68941: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91 Thursday 19 September 2024 15:10:31 -0400 (0:00:00.092) 0:00:12.364 **** 8208 1726773031.68976: entering _queue_task() for managed_node1/copy 8208 1726773031.69209: worker is 1 (out of 1 available) 8208 1726773031.69224: exiting _queue_task() for managed_node1/copy 8208 1726773031.69235: done queuing things up, now waiting for results queue to drain 8208 1726773031.69238: waiting for pending results... 
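The Set active_profile set_fact above produced "__kernel_settings_active_profile": "virtual-guest kernel_settings". The role builds that value with a Jinja2 expression; the Python below is only an illustrative reconstruction of the same idea (append the role's tuned profile name when it is not already listed), not the role's actual template:

```python
# Illustrative reconstruction, not the role's actual Jinja2 template.
cur_profile = "virtual-guest"        # decoded from /etc/tuned/active_profile
tuned_profile = "kernel_settings"    # the role's own tuned profile name

profiles = cur_profile.split()
if tuned_profile not in profiles:    # avoid duplicating the entry on reruns
    profiles.append(tuned_profile)

active_profile = " ".join(profiles)
print(active_profile)                # -> virtual-guest kernel_settings
```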
8527 1726773031.69491: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile 8527 1726773031.69624: in run() - task 0affffe7-6841-f581-0619-00000000014c 8527 1726773031.69643: variable 'ansible_search_path' from source: unknown 8527 1726773031.69648: variable 'ansible_search_path' from source: unknown 8527 1726773031.69687: calling self._execute() 8527 1726773031.69770: variable 'ansible_host' from source: host vars for 'managed_node1' 8527 1726773031.69777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8527 1726773031.69782: variable 'omit' from source: magic vars 8527 1726773031.69876: variable 'omit' from source: magic vars 8527 1726773031.69926: variable 'omit' from source: magic vars 8527 1726773031.69946: variable '__kernel_settings_active_profile' from source: set_fact 8527 1726773031.70221: variable '__kernel_settings_active_profile' from source: set_fact 8527 1726773031.70249: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8527 1726773031.70340: variable '__kernel_settings_tuned_active_profile' from source: role '' all vars 8527 1726773031.70401: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8527 1726773031.70422: variable 'omit' from source: magic vars 8527 1726773031.70452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8527 1726773031.70479: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8527 1726773031.70495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8527 1726773031.70506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8527 1726773031.70515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8527 1726773031.70539: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8527 1726773031.70543: variable 'ansible_host' from source: host vars for 'managed_node1' 8527 1726773031.70546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8527 1726773031.70625: Set connection var ansible_shell_executable to /bin/sh 8527 1726773031.70632: Set connection var ansible_connection to ssh 8527 1726773031.70639: Set connection var ansible_module_compression to ZIP_DEFLATED 8527 1726773031.70648: Set connection var ansible_timeout to 10 8527 1726773031.70651: Set connection var ansible_shell_type to sh 8527 1726773031.70659: Set connection var ansible_pipelining to False 8527 1726773031.70687: variable 'ansible_shell_executable' from source: unknown 8527 1726773031.70692: variable 'ansible_connection' from source: unknown 8527 1726773031.70695: variable 'ansible_module_compression' from source: unknown 8527 1726773031.70698: variable 'ansible_shell_type' from source: unknown 8527 1726773031.70701: variable 'ansible_shell_executable' from source: unknown 8527 1726773031.70704: variable 'ansible_host' from source: host vars for 'managed_node1' 8527 1726773031.70708: variable 'ansible_pipelining' from source: unknown 8527 1726773031.70711: variable 'ansible_timeout' from source: unknown 8527 1726773031.70715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8527 1726773031.70878: Loading ActionModule 'copy' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8527 1726773031.70892: variable 'omit' from source: magic vars 8527 1726773031.70898: starting attempt loop 8527 1726773031.70901: running the handler 8527 1726773031.70911: _low_level_execute_command(): starting 8527 1726773031.70919: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8527 1726773031.73545: stdout chunk (state=2): >>>/root <<< 8527 1726773031.73916: stderr chunk (state=3): >>><<< 8527 1726773031.73925: stdout chunk (state=3): >>><<< 8527 1726773031.73950: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8527 1726773031.73963: _low_level_execute_command(): starting 8527 1726773031.73968: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433 `" && echo ansible-tmp-1726773031.7395835-8527-210680745492433="` echo /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433 `" ) && sleep 0' 8527 1726773031.76922: stdout chunk (state=2): >>>ansible-tmp-1726773031.7395835-8527-210680745492433=/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433 <<< 8527 1726773031.77253: stderr chunk (state=3): >>><<< 8527 1726773031.77262: stdout chunk (state=3): >>><<< 8527 1726773031.77281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773031.7395835-8527-210680745492433=/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433 , stderr= 8527 1726773031.77375: variable 'ansible_module_compression' from source: unknown 8527 1726773031.77435: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8527 1726773031.77466: variable 'ansible_facts' from source: unknown 8527 1726773031.77552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/AnsiballZ_stat.py 8527 1726773031.77801: Sending initial data 8527 1726773031.77808: Sent initial data (151 bytes) 8527 1726773031.80591: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpks00pnz1 /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/AnsiballZ_stat.py <<< 8527 1726773031.82115: stderr chunk (state=3): >>><<< 8527 1726773031.82123: stdout chunk (state=3): >>><<< 8527 1726773031.82144: done transferring module to remote 8527 1726773031.82156: _low_level_execute_command(): starting 8527 1726773031.82161: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/ /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/AnsiballZ_stat.py && sleep 0' 8527 1726773031.84532: stderr chunk (state=2): >>><<< 8527 1726773031.84541: stdout chunk (state=2): >>><<< 8527 1726773031.84555: _low_level_execute_command() done: rc=0, stdout=, stderr= 8527 1726773031.84560: _low_level_execute_command(): starting 8527 1726773031.84567: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/AnsiballZ_stat.py && sleep 0' 8527 1726773032.00495: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": 
"/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726773031.5354335, "mtime": 1726772770.8669214, "ctime": 1726772770.8669214, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8527 1726773032.01737: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8527 1726773032.01755: stdout chunk (state=3): >>><<< 8527 1726773032.01767: stderr chunk (state=3): >>><<< 8527 1726773032.01783: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 14, "inode": 918059, "dev": 51713, "nlink": 1, "atime": 1726773031.5354335, "mtime": 1726772770.8669214, "ctime": 1726772770.8669214, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "mimetype": "text/plain", "charset": "us-ascii", "version": "3521309722", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.43.7 closed. 
8527 1726773032.01829: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8527 1726773032.01912: Sending initial data 8527 1726773032.01924: Sent initial data (140 bytes) 8527 1726773032.04613: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp7owd1lpu /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/source <<< 8527 1726773032.05261: stderr chunk (state=3): >>><<< 8527 1726773032.05270: stdout chunk (state=3): >>><<< 8527 1726773032.05295: _low_level_execute_command(): starting 8527 1726773032.05302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/ /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/source && sleep 0' 8527 1726773032.07743: stderr chunk (state=2): >>><<< 8527 1726773032.07752: stdout chunk (state=2): >>><<< 8527 1726773032.07770: _low_level_execute_command() done: rc=0, stdout=, stderr= 8527 1726773032.07794: variable 'ansible_module_compression' from source: unknown 8527 1726773032.07830: ANSIBALLZ: Using generic lock for ansible.legacy.copy 8527 1726773032.07835: ANSIBALLZ: Acquiring lock 8527 1726773032.07839: ANSIBALLZ: Lock acquired: 139627423671568 8527 1726773032.07843: ANSIBALLZ: Creating module 8527 1726773032.19628: ANSIBALLZ: Writing module into payload 8527 1726773032.19764: ANSIBALLZ: Writing module 8527 1726773032.19789: ANSIBALLZ: Renaming module 8527 1726773032.19799: ANSIBALLZ: Done creating module 8527 1726773032.19812: variable 'ansible_facts' from source: unknown 8527 1726773032.19867: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/AnsiballZ_copy.py 8527 1726773032.19966: Sending initial data 8527 1726773032.19972: Sent initial data (151 bytes) 8527 1726773032.22743: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp3_csvve5 /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/AnsiballZ_copy.py <<< 8527 1726773032.24183: stderr chunk (state=3): >>><<< 8527 1726773032.24195: stdout chunk (state=3): >>><<< 8527 1726773032.24216: done transferring module to remote 8527 1726773032.24226: _low_level_execute_command(): starting 8527 1726773032.24231: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/ /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/AnsiballZ_copy.py && sleep 0' 8527 1726773032.26662: stderr chunk (state=2): >>><<< 8527 1726773032.26678: stdout chunk (state=3): >>><<< 8527 1726773032.26690: _low_level_execute_command() done: rc=0, stdout=, stderr= 8527 1726773032.26695: _low_level_execute_command(): starting 8527 1726773032.26700: _low_level_execute_command(): executing: 
/bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/AnsiballZ_copy.py && sleep 0' 8527 1726773032.42661: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/source", "_original_basename": "tmp7owd1lpu", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8527 1726773032.43929: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8527 1726773032.43943: stdout chunk (state=3): >>><<< 8527 1726773032.43955: stderr chunk (state=3): >>><<< 8527 1726773032.43969: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/source", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/source", "_original_basename": "tmp7owd1lpu", "follow": false, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
8527 1726773032.44008: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/source', '_original_basename': 'tmp7owd1lpu', 'follow': False, 'checksum': 'a79569d3860cb6a066e0e92c8b22ffd0e8796bfd', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8527 1726773032.44021: _low_level_execute_command(): starting 8527 1726773032.44026: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/ > /dev/null 2>&1 && sleep 0' 8527 1726773032.46746: stderr chunk (state=2): >>><<< 8527 1726773032.46759: stdout chunk (state=2): >>><<< 8527 1726773032.46777: _low_level_execute_command() done: rc=0, stdout=, stderr= 8527 1726773032.46787: handler run complete 8527 1726773032.46814: attempt loop complete, returning result 8527 1726773032.46818: _execute() done 8527 1726773032.46821: dumping result to json 8527 1726773032.46827: done dumping result, returning 8527 1726773032.46834: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile [0affffe7-6841-f581-0619-00000000014c] 8527 1726773032.46840: sending task result for task 0affffe7-6841-f581-0619-00000000014c 8527 1726773032.46883: done sending task result for task 0affffe7-6841-f581-0619-00000000014c 8527 1726773032.46889: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "8d80fe3f09ba4b9ac8d7fd5e8541a324", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 30, "src": "/root/.ansible/tmp/ansible-tmp-1726773031.7395835-8527-210680745492433/source", "state": "file", "uid": 0 } 8208 1726773032.47381: no more pending results, returning what we have 8208 1726773032.47389: results queue empty 8208 1726773032.47390: checking for any_errors_fatal 8208 1726773032.47399: done checking for any_errors_fatal 8208 1726773032.47399: checking for max_fail_percentage 8208 1726773032.47401: done checking for max_fail_percentage 8208 1726773032.47401: checking to see if all hosts have failed and the running result is not ok 8208 1726773032.47402: done checking to see if all hosts have failed 8208 1726773032.47403: getting the remaining hosts for this loop 8208 1726773032.47404: done getting the remaining hosts for this loop 8208 1726773032.47407: getting the next task for host managed_node1 8208 1726773032.47413: done getting next task for host managed_node1 8208 1726773032.47416: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8208 1726773032.47420: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773032.47429: getting variables 8208 1726773032.47430: in VariableManager get_vars() 8208 1726773032.47464: Calling all_inventory to load vars for managed_node1 8208 1726773032.47469: Calling groups_inventory to load vars for managed_node1 8208 1726773032.47471: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773032.47480: Calling all_plugins_play to load vars for managed_node1 8208 1726773032.47482: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773032.47486: Calling groups_plugins_play to load vars for managed_node1 8208 1726773032.47692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773032.47902: done with get_vars() 8208 1726773032.47912: done getting variables 8208 1726773032.47969: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set profile_mode to manual] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99 Thursday 19 September 2024 15:10:32 -0400 (0:00:00.790) 0:00:13.154 **** 8208 1726773032.48003: entering _queue_task() for managed_node1/copy 8208 1726773032.48233: worker is 1 (out of 1 available) 8208 1726773032.48249: exiting _queue_task() for managed_node1/copy 8208 1726773032.48260: done queuing things up, now waiting for results queue to drain 8208 1726773032.48261: waiting for pending results... 
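The Ensure kernel_settings is in active_profile task above rewrote /etc/tuned/active_profile (changed: true, mode 0600, size 30). The copy action first runs a stat module against the destination and only transfers new content when the checksums differ, which is why AnsiballZ_stat.py appears before AnsiballZ_copy.py in the log. Below is a simplified, hypothetical sketch of that check-then-write flow; the content string is inferred from the 30-byte size and the fact set earlier, and this is not Ansible's own implementation:

```python
import hashlib
import os

dest = "/etc/tuned/active_profile"
desired = "virtual-guest kernel_settings\n"   # 30 bytes, matching "size": 30

def file_sha1(path):
    """SHA-1 of an existing file, or None if it does not exist."""
    try:
        with open(path, "rb") as f:
            return hashlib.sha1(f.read()).hexdigest()
    except FileNotFoundError:
        return None

if file_sha1(dest) != hashlib.sha1(desired.encode()).hexdigest():
    with open(dest, "w") as f:   # the real module transfers a temp file and renames it
        f.write(desired)
    os.chmod(dest, 0o600)        # "mode": "0600" in the task result
```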
8570 1726773032.48594: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual 8570 1726773032.48739: in run() - task 0affffe7-6841-f581-0619-00000000014d 8570 1726773032.48758: variable 'ansible_search_path' from source: unknown 8570 1726773032.48762: variable 'ansible_search_path' from source: unknown 8570 1726773032.48798: calling self._execute() 8570 1726773032.48875: variable 'ansible_host' from source: host vars for 'managed_node1' 8570 1726773032.48886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8570 1726773032.48895: variable 'omit' from source: magic vars 8570 1726773032.48988: variable 'omit' from source: magic vars 8570 1726773032.49037: variable 'omit' from source: magic vars 8570 1726773032.49062: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 8570 1726773032.49332: variable '__kernel_settings_tuned_profile_mode' from source: role '' all vars 8570 1726773032.49404: variable '__kernel_settings_tuned_dir' from source: role '' all vars 8570 1726773032.49436: variable 'omit' from source: magic vars 8570 1726773032.49477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8570 1726773032.49564: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8570 1726773032.49587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8570 1726773032.49606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8570 1726773032.49619: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8570 1726773032.49648: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8570 1726773032.49654: variable 'ansible_host' from source: host vars for 'managed_node1' 8570 1726773032.49658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8570 1726773032.49752: Set connection var ansible_shell_executable to /bin/sh 8570 1726773032.49757: Set connection var ansible_connection to ssh 8570 1726773032.49761: Set connection var ansible_module_compression to ZIP_DEFLATED 8570 1726773032.49767: Set connection var ansible_timeout to 10 8570 1726773032.49769: Set connection var ansible_shell_type to sh 8570 1726773032.49774: Set connection var ansible_pipelining to False 8570 1726773032.49793: variable 'ansible_shell_executable' from source: unknown 8570 1726773032.49797: variable 'ansible_connection' from source: unknown 8570 1726773032.49801: variable 'ansible_module_compression' from source: unknown 8570 1726773032.49804: variable 'ansible_shell_type' from source: unknown 8570 1726773032.49807: variable 'ansible_shell_executable' from source: unknown 8570 1726773032.49810: variable 'ansible_host' from source: host vars for 'managed_node1' 8570 1726773032.49814: variable 'ansible_pipelining' from source: unknown 8570 1726773032.49817: variable 'ansible_timeout' from source: unknown 8570 1726773032.49821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8570 1726773032.49946: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 8570 1726773032.49958: variable 'omit' from source: magic vars 8570 1726773032.49964: starting attempt loop 8570 1726773032.49967: running the handler 8570 1726773032.49979: _low_level_execute_command(): starting 8570 1726773032.49989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8570 1726773032.53256: stdout chunk (state=2): >>>/root <<< 8570 1726773032.53454: stderr chunk (state=3): >>><<< 8570 1726773032.53462: stdout chunk (state=3): >>><<< 8570 1726773032.53484: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8570 1726773032.53503: _low_level_execute_command(): starting 8570 1726773032.53509: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801 `" && echo ansible-tmp-1726773032.5349584-8570-92547000678801="` echo /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801 `" ) && sleep 0' 8570 1726773032.58666: stdout chunk (state=2): >>>ansible-tmp-1726773032.5349584-8570-92547000678801=/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801 <<< 8570 1726773032.58828: stderr chunk (state=3): >>><<< 8570 1726773032.58836: stdout chunk (state=3): >>><<< 8570 1726773032.58856: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773032.5349584-8570-92547000678801=/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801 , stderr= 8570 1726773032.58952: variable 'ansible_module_compression' from source: unknown 8570 1726773032.59015: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8570 1726773032.59053: variable 'ansible_facts' from source: unknown 8570 1726773032.59154: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/AnsiballZ_stat.py 8570 1726773032.59634: Sending initial data 8570 1726773032.59641: Sent initial data (150 bytes) 8570 1726773032.62893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmprdzj66wm /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/AnsiballZ_stat.py <<< 8570 1726773032.65080: stderr chunk (state=3): >>><<< 8570 1726773032.65093: stdout chunk (state=3): >>><<< 8570 1726773032.65115: done transferring module to remote 8570 1726773032.65128: _low_level_execute_command(): starting 8570 1726773032.65136: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/ /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/AnsiballZ_stat.py && sleep 0' 8570 1726773032.68079: stderr chunk (state=2): >>><<< 8570 1726773032.68091: stdout chunk (state=2): >>><<< 8570 1726773032.68108: _low_level_execute_command() done: rc=0, stdout=, stderr= 8570 1726773032.68114: _low_level_execute_command(): starting 8570 1726773032.68120: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/AnsiballZ_stat.py && sleep 0' 8570 1726773032.84418: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 
1726772770.3489213, "mtime": 1726772770.8669214, "ctime": 1726772770.8669214, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8570 1726773032.85593: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8570 1726773032.85621: stderr chunk (state=3): >>><<< 8570 1726773032.85635: stdout chunk (state=3): >>><<< 8570 1726773032.85649: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0644", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 5, "inode": 917923, "dev": 51713, "nlink": 1, "atime": 1726772770.3489213, "mtime": 1726772770.8669214, "ctime": 1726772770.8669214, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": true, "xgrp": false, "woth": false, "roth": true, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "mimetype": "text/plain", "charset": "us-ascii", "version": "3852760320", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.43.7 closed. 
8570 1726773032.85691: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8570 1726773032.85968: Sending initial data 8570 1726773032.85975: Sent initial data (139 bytes) 8570 1726773032.88667: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpym110y8j /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/source <<< 8570 1726773032.89324: stderr chunk (state=3): >>><<< 8570 1726773032.89333: stdout chunk (state=3): >>><<< 8570 1726773032.89355: _low_level_execute_command(): starting 8570 1726773032.89362: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/ /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/source && sleep 0' 8570 1726773032.92093: stderr chunk (state=2): >>><<< 8570 1726773032.92105: stdout chunk (state=2): >>><<< 8570 1726773032.92122: _low_level_execute_command() done: rc=0, stdout=, stderr= 8570 1726773032.92145: variable 'ansible_module_compression' from source: unknown 8570 1726773032.92196: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8570 1726773032.92219: variable 'ansible_facts' from source: unknown 8570 1726773032.92317: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/AnsiballZ_copy.py 8570 1726773032.92793: Sending initial data 8570 1726773032.92801: Sent initial data (150 bytes) 8570 1726773032.95468: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpvt66dp_h /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/AnsiballZ_copy.py <<< 8570 1726773032.97320: stderr chunk (state=3): >>><<< 8570 1726773032.97331: stdout chunk (state=3): >>><<< 8570 1726773032.97350: done transferring module to remote 8570 1726773032.97360: _low_level_execute_command(): starting 8570 1726773032.97367: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/ /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/AnsiballZ_copy.py && sleep 0' 8570 1726773033.01334: stderr chunk (state=2): >>><<< 8570 1726773033.01345: stdout chunk (state=2): >>><<< 8570 1726773033.01369: _low_level_execute_command() done: rc=0, stdout=, stderr= 8570 1726773033.01375: _low_level_execute_command(): starting 8570 1726773033.01380: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/AnsiballZ_copy.py && sleep 0' 8570 1726773033.18815: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/source", 
"md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/source", "_original_basename": "tmpym110y8j", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8570 1726773033.19164: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8570 1726773033.19177: stdout chunk (state=3): >>><<< 8570 1726773033.19190: stderr chunk (state=3): >>><<< 8570 1726773033.19206: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/source", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/source", "_original_basename": "tmpym110y8j", "follow": false, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
8570 1726773033.19241: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/source', '_original_basename': 'tmpym110y8j', 'follow': False, 'checksum': '3ef9f23deed2e23d3ef2b88b842fb882313e15ce', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8570 1726773033.19253: _low_level_execute_command(): starting 8570 1726773033.19258: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/ > /dev/null 2>&1 && sleep 0' 8570 1726773033.23893: stderr chunk (state=2): >>><<< 8570 1726773033.23903: stdout chunk (state=2): >>><<< 8570 1726773033.23917: _low_level_execute_command() done: rc=0, stdout=, stderr= 8570 1726773033.23926: handler run complete 8570 1726773033.23958: attempt loop complete, returning result 8570 1726773033.23966: _execute() done 8570 1726773033.23970: dumping result to json 8570 1726773033.23976: done dumping result, returning 8570 1726773033.23984: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set profile_mode to manual [0affffe7-6841-f581-0619-00000000014d] 8570 1726773033.23993: sending task result for task 0affffe7-6841-f581-0619-00000000014d 8570 1726773033.24031: done sending task result for task 0affffe7-6841-f581-0619-00000000014d 8570 1726773033.24034: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "cf3f2a865fbea819dadd439586eaee31", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 7, "src": "/root/.ansible/tmp/ansible-tmp-1726773032.5349584-8570-92547000678801/source", "state": "file", "uid": 0 } 8208 1726773033.24571: no more pending results, returning what we have 8208 1726773033.24574: results queue empty 8208 1726773033.24575: checking for any_errors_fatal 8208 1726773033.24582: done checking for any_errors_fatal 8208 1726773033.24583: checking for max_fail_percentage 8208 1726773033.24584: done checking for max_fail_percentage 8208 1726773033.24586: checking to see if all hosts have failed and the running result is not ok 8208 1726773033.24587: done checking to see if all hosts have failed 8208 1726773033.24587: getting the remaining hosts for this loop 8208 1726773033.24588: done getting the remaining hosts for this loop 8208 1726773033.24591: getting the next task for host managed_node1 8208 1726773033.24596: done getting next task for host managed_node1 8208 1726773033.24600: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get current config 8208 1726773033.24603: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773033.24612: getting variables 8208 1726773033.24613: in VariableManager get_vars() 8208 1726773033.24646: Calling all_inventory to load vars for managed_node1 8208 1726773033.24649: Calling groups_inventory to load vars for managed_node1 8208 1726773033.24650: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773033.24658: Calling all_plugins_play to load vars for managed_node1 8208 1726773033.24660: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773033.24663: Calling groups_plugins_play to load vars for managed_node1 8208 1726773033.24830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773033.25037: done with get_vars() 8208 1726773033.25047: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Get current config] ********** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107 Thursday 19 September 2024 15:10:33 -0400 (0:00:00.771) 0:00:13.925 **** 8208 1726773033.25129: entering _queue_task() for managed_node1/fedora.linux_system_roles.kernel_settings_get_config 8208 1726773033.25351: worker is 1 (out of 1 available) 8208 1726773033.25367: exiting _queue_task() for managed_node1/fedora.linux_system_roles.kernel_settings_get_config 8208 1726773033.25379: done queuing things up, now waiting for results queue to drain 8208 1726773033.25381: waiting for pending results... 
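The next task, Get current config, calls the collection's own kernel_settings_get_config module (ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config per the ANSIBALLZ cache entry below) to read back the tuned profile the role manages. As a rough, hypothetical sketch only, reading a tuned-style INI profile amounts to something like the following; the path is assumed from the role's profile variables, not taken from the log:

```python
import configparser

# Assumed location of the role-managed tuned profile; not shown in the log.
profile_conf = "/etc/tuned/kernel_settings/tuned.conf"

parser = configparser.ConfigParser(interpolation=None)
parser.read(profile_conf)
current_config = {section: dict(parser.items(section)) for section in parser.sections()}
print(current_config)   # e.g. {"main": {...}, "sysctl": {...}} depending on the profile
```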
8617 1726773033.25615: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get current config 8617 1726773033.25756: in run() - task 0affffe7-6841-f581-0619-00000000014e 8617 1726773033.25775: variable 'ansible_search_path' from source: unknown 8617 1726773033.25779: variable 'ansible_search_path' from source: unknown 8617 1726773033.25814: calling self._execute() 8617 1726773033.25879: variable 'ansible_host' from source: host vars for 'managed_node1' 8617 1726773033.25889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8617 1726773033.25897: variable 'omit' from source: magic vars 8617 1726773033.25987: variable 'omit' from source: magic vars 8617 1726773033.26043: variable 'omit' from source: magic vars 8617 1726773033.26071: variable '__kernel_settings_profile_filename' from source: role '' all vars 8617 1726773033.26351: variable '__kernel_settings_profile_filename' from source: role '' all vars 8617 1726773033.26541: variable '__kernel_settings_profile_dir' from source: role '' all vars 8617 1726773033.26614: variable '__kernel_settings_profile_parent' from source: set_fact 8617 1726773033.26623: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8617 1726773033.26657: variable 'omit' from source: magic vars 8617 1726773033.26700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8617 1726773033.26731: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8617 1726773033.26748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8617 1726773033.26759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8617 1726773033.26769: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8617 1726773033.26796: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8617 1726773033.26802: variable 'ansible_host' from source: host vars for 'managed_node1' 8617 1726773033.26806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8617 1726773033.26907: Set connection var ansible_shell_executable to /bin/sh 8617 1726773033.26914: Set connection var ansible_connection to ssh 8617 1726773033.26920: Set connection var ansible_module_compression to ZIP_DEFLATED 8617 1726773033.26928: Set connection var ansible_timeout to 10 8617 1726773033.26931: Set connection var ansible_shell_type to sh 8617 1726773033.26938: Set connection var ansible_pipelining to False 8617 1726773033.26958: variable 'ansible_shell_executable' from source: unknown 8617 1726773033.26963: variable 'ansible_connection' from source: unknown 8617 1726773033.26970: variable 'ansible_module_compression' from source: unknown 8617 1726773033.26973: variable 'ansible_shell_type' from source: unknown 8617 1726773033.26976: variable 'ansible_shell_executable' from source: unknown 8617 1726773033.26978: variable 'ansible_host' from source: host vars for 'managed_node1' 8617 1726773033.26981: variable 'ansible_pipelining' from source: unknown 8617 1726773033.26984: variable 'ansible_timeout' from source: unknown 8617 1726773033.26989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8617 1726773033.27148: Loading ActionModule 'normal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8617 1726773033.27160: variable 'omit' from source: magic vars 8617 1726773033.27169: starting attempt loop 8617 1726773033.27173: running the handler 8617 1726773033.27184: _low_level_execute_command(): starting 8617 1726773033.27195: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8617 1726773033.30077: stdout chunk (state=2): >>>/root <<< 8617 1726773033.30191: stderr chunk (state=3): >>><<< 8617 1726773033.30198: stdout chunk (state=3): >>><<< 8617 1726773033.30218: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8617 1726773033.30235: _low_level_execute_command(): starting 8617 1726773033.30246: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877 `" && echo ansible-tmp-1726773033.3022833-8617-227983771448877="` echo /root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877 `" ) && sleep 0' 8617 1726773033.33344: stdout chunk (state=2): >>>ansible-tmp-1726773033.3022833-8617-227983771448877=/root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877 <<< 8617 1726773033.33469: stderr chunk (state=3): >>><<< 8617 1726773033.33477: stdout chunk (state=3): >>><<< 8617 1726773033.33494: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773033.3022833-8617-227983771448877=/root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877 , stderr= 8617 1726773033.33536: variable 'ansible_module_compression' from source: unknown 8617 1726773033.33567: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.kernel_settings_get_config-ZIP_DEFLATED 8617 1726773033.33598: variable 'ansible_facts' from source: unknown 8617 1726773033.33662: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877/AnsiballZ_kernel_settings_get_config.py 8617 1726773033.33799: Sending initial data 8617 1726773033.33807: Sent initial data (173 bytes) 8617 1726773033.36380: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpfb9fqmta /root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877/AnsiballZ_kernel_settings_get_config.py <<< 8617 1726773033.37790: stderr chunk (state=3): >>><<< 8617 1726773033.37800: stdout chunk (state=3): >>><<< 8617 1726773033.37822: done transferring module to remote 8617 1726773033.37834: _low_level_execute_command(): starting 8617 1726773033.37840: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877/ /root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8617 1726773033.40278: stderr chunk (state=2): >>><<< 8617 1726773033.40290: stdout chunk (state=2): >>><<< 8617 1726773033.40304: _low_level_execute_command() done: rc=0, stdout=, stderr= 8617 1726773033.40307: _low_level_execute_command(): starting 8617 1726773033.40311: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877/AnsiballZ_kernel_settings_get_config.py && sleep 0' 8617 1726773033.55800: stdout chunk (state=2): >>> {"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} <<< 8617 1726773033.56799: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8617 1726773033.56848: stderr chunk (state=3): >>><<< 8617 1726773033.56855: stdout chunk (state=3): >>><<< 8617 1726773033.56873: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "data": {}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf"}}} , stderr=Shared connection to 10.31.43.7 closed. 8617 1726773033.56896: done with _execute_module (fedora.linux_system_roles.kernel_settings_get_config, {'path': '/etc/tuned/kernel_settings/tuned.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'fedora.linux_system_roles.kernel_settings_get_config', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8617 1726773033.56907: _low_level_execute_command(): starting 8617 1726773033.56913: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773033.3022833-8617-227983771448877/ > /dev/null 2>&1 && sleep 0' 8617 1726773033.59445: stderr chunk (state=2): >>><<< 8617 1726773033.59455: stdout chunk (state=2): >>><<< 8617 1726773033.59478: _low_level_execute_command() done: rc=0, stdout=, stderr= 8617 1726773033.59490: handler run complete 8617 1726773033.59509: attempt loop complete, returning result 8617 1726773033.59518: _execute() done 8617 1726773033.59522: dumping result to json 8617 1726773033.59526: done dumping result, returning 8617 1726773033.59534: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get current config [0affffe7-6841-f581-0619-00000000014e] 8617 1726773033.59544: sending task result for task 0affffe7-6841-f581-0619-00000000014e 8617 1726773033.59576: done sending task result for task 0affffe7-6841-f581-0619-00000000014e 8617 1726773033.59580: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "data": {} } 8208 1726773033.59718: no more pending results, returning what we have 8208 1726773033.59721: results queue empty 8208 1726773033.59722: checking for any_errors_fatal 8208 1726773033.59728: done checking for any_errors_fatal 8208 1726773033.59728: checking for max_fail_percentage 8208 1726773033.59730: done checking for max_fail_percentage 8208 1726773033.59730: checking to see if all hosts have failed and the running result is not ok 8208 1726773033.59731: done checking to see if all hosts have failed 8208 1726773033.59731: getting the remaining hosts for this loop 8208 1726773033.59732: done getting the remaining hosts for this loop 8208 1726773033.59735: getting the next task for host managed_node1 8208 1726773033.59742: done getting next task for host managed_node1 8208 1726773033.59744: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Apply kernel 
settings 8208 1726773033.59748: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773033.59757: getting variables 8208 1726773033.59758: in VariableManager get_vars() 8208 1726773033.59795: Calling all_inventory to load vars for managed_node1 8208 1726773033.59798: Calling groups_inventory to load vars for managed_node1 8208 1726773033.59799: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773033.59809: Calling all_plugins_play to load vars for managed_node1 8208 1726773033.59811: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773033.59813: Calling groups_plugins_play to load vars for managed_node1 8208 1726773033.59978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773033.60105: done with get_vars() 8208 1726773033.60114: done getting variables 8208 1726773033.60198: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Apply kernel settings] ******* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112 Thursday 19 September 2024 15:10:33 -0400 (0:00:00.350) 0:00:14.276 **** 8208 1726773033.60224: entering _queue_task() for managed_node1/template 8208 1726773033.60225: Creating lock for template 8208 1726773033.60415: worker is 1 (out of 1 available) 8208 1726773033.60430: exiting _queue_task() for managed_node1/template 8208 1726773033.60442: done queuing things up, now waiting for results queue to drain 8208 1726773033.60444: waiting for pending results... 
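The "Apply kernel settings" task queued above is a template action: as the trace below shows, the worker renders kernel_settings.j2 locally, stats the remote destination, and then delivers the rendered file through ansible.legacy.copy. A hedged reconstruction of an equivalent task, using only the src, dest and mode echoed in the copy module_args rather than the role's actual source, looks like this:

- name: Apply kernel settings
  ansible.builtin.template:
    src: kernel_settings.j2
    dest: /etc/tuned/kernel_settings/tuned.conf
    mode: "0644"

In the role itself these values come from variables such as __kernel_settings_profile_src and __kernel_settings_tuned_profile, which the variable-resolution messages below enumerate.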
8631 1726773033.60563: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings 8631 1726773033.60681: in run() - task 0affffe7-6841-f581-0619-00000000014f 8631 1726773033.60697: variable 'ansible_search_path' from source: unknown 8631 1726773033.60701: variable 'ansible_search_path' from source: unknown 8631 1726773033.60728: calling self._execute() 8631 1726773033.60792: variable 'ansible_host' from source: host vars for 'managed_node1' 8631 1726773033.60800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8631 1726773033.60806: variable 'omit' from source: magic vars 8631 1726773033.60879: variable 'omit' from source: magic vars 8631 1726773033.60926: variable 'omit' from source: magic vars 8631 1726773033.61173: variable '__kernel_settings_profile_src' from source: role '' all vars 8631 1726773033.61183: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8631 1726773033.61257: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8631 1726773033.61280: variable '__kernel_settings_profile_filename' from source: role '' all vars 8631 1726773033.61340: variable '__kernel_settings_profile_filename' from source: role '' all vars 8631 1726773033.61431: variable '__kernel_settings_profile_dir' from source: role '' all vars 8631 1726773033.61512: variable '__kernel_settings_profile_parent' from source: set_fact 8631 1726773033.61520: variable '__kernel_settings_tuned_profile' from source: role '' all vars 8631 1726773033.61549: variable 'omit' from source: magic vars 8631 1726773033.61594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8631 1726773033.61629: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8631 1726773033.61652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8631 1726773033.61674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8631 1726773033.61689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8631 1726773033.61720: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8631 1726773033.61727: variable 'ansible_host' from source: host vars for 'managed_node1' 8631 1726773033.61731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8631 1726773033.61831: Set connection var ansible_shell_executable to /bin/sh 8631 1726773033.61837: Set connection var ansible_connection to ssh 8631 1726773033.61843: Set connection var ansible_module_compression to ZIP_DEFLATED 8631 1726773033.61850: Set connection var ansible_timeout to 10 8631 1726773033.61852: Set connection var ansible_shell_type to sh 8631 1726773033.61859: Set connection var ansible_pipelining to False 8631 1726773033.61882: variable 'ansible_shell_executable' from source: unknown 8631 1726773033.61890: variable 'ansible_connection' from source: unknown 8631 1726773033.61893: variable 'ansible_module_compression' from source: unknown 8631 1726773033.61896: variable 'ansible_shell_type' from source: unknown 8631 1726773033.61899: variable 'ansible_shell_executable' from source: unknown 8631 1726773033.61901: variable 'ansible_host' from source: host vars for 'managed_node1' 8631 1726773033.61904: variable 'ansible_pipelining' from 
source: unknown 8631 1726773033.61907: variable 'ansible_timeout' from source: unknown 8631 1726773033.61911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8631 1726773033.62028: Loading ActionModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/template.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8631 1726773033.62040: variable 'omit' from source: magic vars 8631 1726773033.62046: starting attempt loop 8631 1726773033.62050: running the handler 8631 1726773033.62061: _low_level_execute_command(): starting 8631 1726773033.62074: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8631 1726773033.64498: stdout chunk (state=2): >>>/root <<< 8631 1726773033.64650: stderr chunk (state=3): >>><<< 8631 1726773033.64658: stdout chunk (state=3): >>><<< 8631 1726773033.64683: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8631 1726773033.64700: _low_level_execute_command(): starting 8631 1726773033.64708: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707 `" && echo ansible-tmp-1726773033.6469457-8631-197499360864707="` echo /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707 `" ) && sleep 0' 8631 1726773033.67917: stdout chunk (state=2): >>>ansible-tmp-1726773033.6469457-8631-197499360864707=/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707 <<< 8631 1726773033.68159: stderr chunk (state=3): >>><<< 8631 1726773033.68169: stdout chunk (state=3): >>><<< 8631 1726773033.68186: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773033.6469457-8631-197499360864707=/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707 , stderr= 8631 1726773033.68203: evaluation_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks 8631 1726773033.68231: search_path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/templates/kernel_settings.j2 /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/kernel_settings.j2 8631 1726773033.68256: variable 'ansible_search_path' from source: unknown 8631 1726773033.68859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8631 1726773033.70340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8631 1726773033.70412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8631 1726773033.70447: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8631 1726773033.70480: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8631 1726773033.70507: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8631 1726773033.70703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8631 1726773033.70724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8631 1726773033.70748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8631 1726773033.70777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8631 1726773033.70791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8631 1726773033.71102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8631 1726773033.71125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8631 1726773033.71148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8631 1726773033.71186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8631 1726773033.71201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8631 1726773033.71564: variable 'ansible_managed' from source: unknown 8631 1726773033.71574: variable '__sections' from source: task vars 8631 1726773033.71662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8631 1726773033.71689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8631 1726773033.71708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8631 1726773033.71733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8631 1726773033.71744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8631 1726773033.71819: variable 'kernel_settings_sysctl' from source: include params 8631 1726773033.71826: variable '__kernel_settings_state_empty' from source: role '' all vars 8631 1726773033.71832: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8631 1726773033.71863: variable '__sysctl_old' from source: task vars 8631 1726773033.71912: variable '__sysctl_old' from source: task vars 8631 1726773033.72052: variable 'kernel_settings_purge' from source: include params 8631 1726773033.72059: variable 'kernel_settings_sysctl' from source: include params 8631 1726773033.72067: variable '__kernel_settings_state_empty' from source: role '' all vars 8631 1726773033.72072: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8631 1726773033.72077: variable '__kernel_settings_profile_contents' from source: set_fact 8631 1726773033.72209: variable 'kernel_settings_sysfs' from source: include params 8631 1726773033.72215: variable '__kernel_settings_state_empty' from source: role '' all vars 8631 1726773033.72221: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8631 1726773033.72234: variable '__sysfs_old' from source: task vars 8631 1726773033.72276: variable '__sysfs_old' from source: task vars 8631 1726773033.72418: variable 'kernel_settings_purge' from source: include params 8631 1726773033.72425: variable 'kernel_settings_sysfs' from source: include params 8631 1726773033.72430: variable '__kernel_settings_state_empty' from source: role '' all vars 8631 1726773033.72435: variable '__kernel_settings_previous_replaced' from source: role '' all vars 8631 1726773033.72440: variable '__kernel_settings_profile_contents' from source: set_fact 8631 1726773033.72454: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 8631 1726773033.72462: variable '__systemd_old' from source: task vars 8631 1726773033.72505: variable '__systemd_old' from source: task vars 8631 1726773033.72635: variable 'kernel_settings_purge' from source: include params 8631 1726773033.72642: variable 'kernel_settings_systemd_cpu_affinity' from source: include params 8631 1726773033.72647: variable '__kernel_settings_state_absent' from source: role '' all vars 8631 1726773033.72653: variable '__kernel_settings_profile_contents' from source: set_fact 8631 1726773033.72662: variable 'kernel_settings_transparent_hugepages' from source: include params 8631 1726773033.72669: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 8631 1726773033.72673: variable '__trans_huge_old' from source: task vars 8631 1726773033.72714: variable '__trans_huge_old' from source: task vars 8631 1726773033.72847: variable 'kernel_settings_purge' from source: include params 8631 1726773033.72853: variable 'kernel_settings_transparent_hugepages' from source: include params 8631 1726773033.72858: variable '__kernel_settings_state_absent' from source: role '' all vars 8631 1726773033.72864: variable '__kernel_settings_profile_contents' from source: set_fact 8631 1726773033.72874: variable '__trans_defrag_old' from source: task vars 8631 1726773033.72916: variable '__trans_defrag_old' from 
source: task vars 8631 1726773033.73045: variable 'kernel_settings_purge' from source: include params 8631 1726773033.73053: variable 'kernel_settings_transparent_hugepages_defrag' from source: include params 8631 1726773033.73058: variable '__kernel_settings_state_absent' from source: role '' all vars 8631 1726773033.73064: variable '__kernel_settings_profile_contents' from source: set_fact 8631 1726773033.73083: variable '__kernel_settings_state_absent' from source: role '' all vars 8631 1726773033.73096: variable '__kernel_settings_state_absent' from source: role '' all vars 8631 1726773033.73102: variable '__kernel_settings_state_absent' from source: role '' all vars 8631 1726773033.74141: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8631 1726773033.74198: variable 'ansible_module_compression' from source: unknown 8631 1726773033.74253: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8631 1726773033.74281: variable 'ansible_facts' from source: unknown 8631 1726773033.74379: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/AnsiballZ_stat.py 8631 1726773033.74851: Sending initial data 8631 1726773033.74858: Sent initial data (151 bytes) 8631 1726773033.77793: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp_liqelwm /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/AnsiballZ_stat.py <<< 8631 1726773033.79891: stderr chunk (state=3): >>><<< 8631 1726773033.79902: stdout chunk (state=3): >>><<< 8631 1726773033.79925: done transferring module to remote 8631 1726773033.79940: _low_level_execute_command(): starting 8631 1726773033.79947: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/ /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/AnsiballZ_stat.py && sleep 0' 8631 1726773033.82693: stderr chunk (state=2): >>><<< 8631 1726773033.82706: stdout chunk (state=2): >>><<< 8631 1726773033.82723: _low_level_execute_command() done: rc=0, stdout=, stderr= 8631 1726773033.82728: _low_level_execute_command(): starting 8631 1726773033.82734: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/AnsiballZ_stat.py && sleep 0' 8631 1726773033.98157: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8631 1726773033.99235: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8631 1726773033.99287: stderr chunk (state=3): >>><<< 8631 1726773033.99294: stdout chunk (state=3): >>><<< 8631 1726773033.99309: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings/tuned.conf", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.43.7 closed. 
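The stat result above (exists: false) is the first half of the two-phase flow the template action follows in this trace: it checks the destination with ansible.legacy.stat, and because the file is absent it uploads the rendered source and runs ansible.legacy.copy, which comes next. The stat step on its own corresponds roughly to the following task, sketched from the module_args echoed above (the register name is hypothetical):

- name: Check whether the rendered profile already exists
  ansible.builtin.stat:
    path: /etc/tuned/kernel_settings/tuned.conf
    get_checksum: true
    checksum_algorithm: sha1
  register: profile_stat  # hypothetical name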
8631 1726773033.99331: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/kernel_settings/tuned.conf', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8631 1726773033.99423: Sending initial data 8631 1726773033.99431: Sent initial data (159 bytes) 8631 1726773034.02184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp5lpep9p6/kernel_settings.j2 /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/source <<< 8631 1726773034.02836: stderr chunk (state=3): >>><<< 8631 1726773034.02845: stdout chunk (state=3): >>><<< 8631 1726773034.02860: _low_level_execute_command(): starting 8631 1726773034.02868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/ /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/source && sleep 0' 8631 1726773034.05377: stderr chunk (state=2): >>><<< 8631 1726773034.05390: stdout chunk (state=2): >>><<< 8631 1726773034.05406: _low_level_execute_command() done: rc=0, stdout=, stderr= 8631 1726773034.05426: variable 'ansible_module_compression' from source: unknown 8631 1726773034.05461: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8631 1726773034.05485: variable 'ansible_facts' from source: unknown 8631 1726773034.05548: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/AnsiballZ_copy.py 8631 1726773034.05641: Sending initial data 8631 1726773034.05649: Sent initial data (151 bytes) 8631 1726773034.08344: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp8pb28mt0 /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/AnsiballZ_copy.py <<< 8631 1726773034.10014: stderr chunk (state=3): >>><<< 8631 1726773034.10025: stdout chunk (state=3): >>><<< 8631 1726773034.10048: done transferring module to remote 8631 1726773034.10059: _low_level_execute_command(): starting 8631 1726773034.10064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/ /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/AnsiballZ_copy.py && sleep 0' 8631 1726773034.13013: stderr chunk (state=2): >>><<< 8631 1726773034.13024: stdout chunk (state=2): >>><<< 8631 1726773034.13042: _low_level_execute_command() done: rc=0, stdout=, stderr= 8631 1726773034.13048: _low_level_execute_command(): starting 8631 1726773034.13054: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/AnsiballZ_copy.py && sleep 0' 8631 1726773034.29692: stdout chunk (state=2): >>> {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": 
"/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8631 1726773034.30833: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8631 1726773034.30843: stdout chunk (state=3): >>><<< 8631 1726773034.30855: stderr chunk (state=3): >>><<< 8631 1726773034.30868: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/kernel_settings/tuned.conf", "src": "/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/source", "md5sum": "7d83891795eeb6debeff7e2812501630", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0644", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "invocation": {"module_args": {"src": "/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/source", "dest": "/etc/tuned/kernel_settings/tuned.conf", "mode": "0644", "follow": false, "_original_basename": "kernel_settings.j2", "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
8631 1726773034.30902: done with _execute_module (ansible.legacy.copy, {'src': '/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/source', 'dest': '/etc/tuned/kernel_settings/tuned.conf', 'mode': '0644', 'follow': False, '_original_basename': 'kernel_settings.j2', 'checksum': 'e44ba7fc7046252a1b6772f7347d0e7b9b48a069', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8631 1726773034.30930: _low_level_execute_command(): starting 8631 1726773034.30937: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/ > /dev/null 2>&1 && sleep 0' 8631 1726773034.33636: stderr chunk (state=2): >>><<< 8631 1726773034.33646: stdout chunk (state=2): >>><<< 8631 1726773034.33662: _low_level_execute_command() done: rc=0, stdout=, stderr= 8631 1726773034.33676: handler run complete 8631 1726773034.33704: attempt loop complete, returning result 8631 1726773034.33710: _execute() done 8631 1726773034.33713: dumping result to json 8631 1726773034.33718: done dumping result, returning 8631 1726773034.33725: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Apply kernel settings [0affffe7-6841-f581-0619-00000000014f] 8631 1726773034.33731: sending task result for task 0affffe7-6841-f581-0619-00000000014f 8631 1726773034.33775: done sending task result for task 0affffe7-6841-f581-0619-00000000014f 8631 1726773034.33779: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "checksum": "e44ba7fc7046252a1b6772f7347d0e7b9b48a069", "dest": "/etc/tuned/kernel_settings/tuned.conf", "gid": 0, "group": "root", "md5sum": "7d83891795eeb6debeff7e2812501630", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 86, "src": "/root/.ansible/tmp/ansible-tmp-1726773033.6469457-8631-197499360864707/source", "state": "file", "uid": 0 } 8208 1726773034.33955: no more pending results, returning what we have 8208 1726773034.33958: results queue empty 8208 1726773034.33959: checking for any_errors_fatal 8208 1726773034.33965: done checking for any_errors_fatal 8208 1726773034.33966: checking for max_fail_percentage 8208 1726773034.33967: done checking for max_fail_percentage 8208 1726773034.33968: checking to see if all hosts have failed and the running result is not ok 8208 1726773034.33968: done checking to see if all hosts have failed 8208 1726773034.33969: getting the remaining hosts for this loop 8208 1726773034.33970: done getting the remaining hosts for this loop 8208 1726773034.33973: getting the next task for host managed_node1 8208 1726773034.33979: done getting next task for host managed_node1 8208 1726773034.33983: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8208 1726773034.33987: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773034.33997: getting variables 8208 1726773034.33998: in VariableManager get_vars() 8208 1726773034.34031: Calling all_inventory to load vars for managed_node1 8208 1726773034.34034: Calling groups_inventory to load vars for managed_node1 8208 1726773034.34036: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773034.34044: Calling all_plugins_play to load vars for managed_node1 8208 1726773034.34046: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773034.34048: Calling groups_plugins_play to load vars for managed_node1 8208 1726773034.34159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773034.34280: done with get_vars() 8208 1726773034.34292: done getting variables 8208 1726773034.34334: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149 Thursday 19 September 2024 15:10:34 -0400 (0:00:00.741) 0:00:15.017 **** 8208 1726773034.34359: entering _queue_task() for managed_node1/service 8208 1726773034.34558: worker is 1 (out of 1 available) 8208 1726773034.34574: exiting _queue_task() for managed_node1/service 8208 1726773034.34587: done queuing things up, now waiting for results queue to drain 8208 1726773034.34589: waiting for pending results... 
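The restart task queued above is both gated and looped: as the worker messages below show, the conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed) evaluates to True, the task iterates over __kernel_settings_services from include_vars, and the systemd module is invoked with name=tuned, state=restarted, enabled=true. A sketch assembled only from those traced values, not the role's literal source, would be:

- name: Restart tuned to apply active profile, mode changes
  ansible.builtin.service:
    name: "{{ item }}"
    state: restarted
    enabled: true
  loop: "{{ __kernel_settings_services }}"
  when: >-
    __kernel_settings_register_profile is changed or
    __kernel_settings_register_mode is changed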
8667 1726773034.34702: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes 8667 1726773034.34821: in run() - task 0affffe7-6841-f581-0619-000000000150 8667 1726773034.34837: variable 'ansible_search_path' from source: unknown 8667 1726773034.34841: variable 'ansible_search_path' from source: unknown 8667 1726773034.34876: variable '__kernel_settings_services' from source: include_vars 8667 1726773034.35169: variable '__kernel_settings_services' from source: include_vars 8667 1726773034.35217: variable 'omit' from source: magic vars 8667 1726773034.35293: variable 'ansible_host' from source: host vars for 'managed_node1' 8667 1726773034.35303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8667 1726773034.35312: variable 'omit' from source: magic vars 8667 1726773034.35492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8667 1726773034.35676: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8667 1726773034.35711: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8667 1726773034.35736: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8667 1726773034.35762: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8667 1726773034.35834: variable '__kernel_settings_register_profile' from source: set_fact 8667 1726773034.35844: variable '__kernel_settings_register_mode' from source: set_fact 8667 1726773034.35857: Evaluated conditional (__kernel_settings_register_profile is changed or __kernel_settings_register_mode is changed): True 8667 1726773034.35863: variable 'omit' from source: magic vars 8667 1726773034.35898: variable 'omit' from source: magic vars 8667 1726773034.35927: variable 'item' from source: unknown 8667 1726773034.35974: variable 'item' from source: unknown 8667 1726773034.35991: variable 'omit' from source: magic vars 8667 1726773034.36014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8667 1726773034.36033: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8667 1726773034.36046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8667 1726773034.36058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8667 1726773034.36066: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8667 1726773034.36088: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8667 1726773034.36092: variable 'ansible_host' from source: host vars for 'managed_node1' 8667 1726773034.36094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8667 1726773034.36156: Set connection var ansible_shell_executable to /bin/sh 8667 1726773034.36161: Set connection var ansible_connection to ssh 8667 1726773034.36165: Set connection var ansible_module_compression to ZIP_DEFLATED 8667 1726773034.36170: Set connection var ansible_timeout to 10 8667 1726773034.36172: Set connection var ansible_shell_type to sh 8667 1726773034.36176: Set connection var ansible_pipelining to False 8667 
1726773034.36193: variable 'ansible_shell_executable' from source: unknown 8667 1726773034.36197: variable 'ansible_connection' from source: unknown 8667 1726773034.36199: variable 'ansible_module_compression' from source: unknown 8667 1726773034.36202: variable 'ansible_shell_type' from source: unknown 8667 1726773034.36204: variable 'ansible_shell_executable' from source: unknown 8667 1726773034.36206: variable 'ansible_host' from source: host vars for 'managed_node1' 8667 1726773034.36210: variable 'ansible_pipelining' from source: unknown 8667 1726773034.36212: variable 'ansible_timeout' from source: unknown 8667 1726773034.36215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8667 1726773034.36309: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8667 1726773034.36320: variable 'omit' from source: magic vars 8667 1726773034.36325: starting attempt loop 8667 1726773034.36328: running the handler 8667 1726773034.36388: variable 'ansible_facts' from source: unknown 8667 1726773034.36506: _low_level_execute_command(): starting 8667 1726773034.36515: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8667 1726773034.39091: stdout chunk (state=2): >>>/root <<< 8667 1726773034.39295: stderr chunk (state=3): >>><<< 8667 1726773034.39303: stdout chunk (state=3): >>><<< 8667 1726773034.39325: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8667 1726773034.39342: _low_level_execute_command(): starting 8667 1726773034.39348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168 `" && echo ansible-tmp-1726773034.3933396-8667-263816202868168="` echo /root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168 `" ) && sleep 0' 8667 1726773034.42154: stdout chunk (state=2): >>>ansible-tmp-1726773034.3933396-8667-263816202868168=/root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168 <<< 8667 1726773034.42281: stderr chunk (state=3): >>><<< 8667 1726773034.42291: stdout chunk (state=3): >>><<< 8667 1726773034.42309: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773034.3933396-8667-263816202868168=/root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168 , stderr= 8667 1726773034.42337: variable 'ansible_module_compression' from source: unknown 8667 1726773034.42378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8667 1726773034.42431: variable 'ansible_facts' from source: unknown 8667 1726773034.42590: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168/AnsiballZ_systemd.py 8667 1726773034.42701: Sending initial data 8667 1726773034.42708: Sent initial data (154 bytes) 8667 1726773034.45426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpai2ehjlf /root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168/AnsiballZ_systemd.py <<< 8667 1726773034.47776: stderr chunk (state=3): >>><<< 8667 1726773034.47789: stdout chunk (state=3): >>><<< 8667 1726773034.47813: done transferring module to remote 8667 
1726773034.47825: _low_level_execute_command(): starting 8667 1726773034.47831: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168/ /root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168/AnsiballZ_systemd.py && sleep 0' 8667 1726773034.50372: stderr chunk (state=2): >>><<< 8667 1726773034.50381: stdout chunk (state=2): >>><<< 8667 1726773034.50397: _low_level_execute_command() done: rc=0, stdout=, stderr= 8667 1726773034.50402: _low_level_execute_command(): starting 8667 1726773034.50408: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168/AnsiballZ_systemd.py && sleep 0' 8667 1726773035.07103: stdout chunk (state=2): >>> {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:06:10 EDT", "WatchdogTimestampMonotonic": "24900732", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "664", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ExecMainStartTimestampMonotonic": "23763308", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:06:09 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18620416", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", 
"LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service shutdown.target cpupower.service auto-cpufreq.service tlp.service", "Before": "shutdown.target multi-user.target", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:06:10 EDT", "StateChangeTimestampMonotonic": "24900735", "InactiveExitTimestamp": "Thu 2024-09-19 15:06:09 EDT", "InactiveExitTimestampMonotonic": 
"23763483", "ActiveEnterTimestamp": "Thu 2024-09-19 15:06:10 EDT", "ActiveEnterTimestampMonotonic": "24900735", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ConditionTimestampMonotonic": "23762466", "AssertTimestamp": "Thu 2024-09-19 15:06:09 EDT", "AssertTimestampMonotonic": "23762468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "883d3d1b58be437785da31f48ec3b86d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8667 1726773035.09106: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8667 1726773035.09119: stdout chunk (state=3): >>><<< 8667 1726773035.09130: stderr chunk (state=3): >>><<< 8667 1726773035.09151: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": true, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:06:10 EDT", "WatchdogTimestampMonotonic": "24900732", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "664", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ExecMainStartTimestampMonotonic": "23763308", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:06:09 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "18620416", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service shutdown.target 
cpupower.service auto-cpufreq.service tlp.service", "Before": "shutdown.target multi-user.target", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:06:10 EDT", "StateChangeTimestampMonotonic": "24900735", "InactiveExitTimestamp": "Thu 2024-09-19 15:06:09 EDT", "InactiveExitTimestampMonotonic": "23763483", "ActiveEnterTimestamp": "Thu 2024-09-19 15:06:10 EDT", "ActiveEnterTimestampMonotonic": "24900735", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ConditionTimestampMonotonic": "23762466", "AssertTimestamp": "Thu 2024-09-19 15:06:09 EDT", "AssertTimestampMonotonic": "23762468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "883d3d1b58be437785da31f48ec3b86d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "restarted", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
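The raw module result above was produced by the systemd module restarting and enabling tuned (module_args: name=tuned, state=restarted, enabled=true, scope=system). A minimal sketch of the kind of loop task that would generate this invocation follows; the loop source is an assumption, since the role's actual task file is not shown in this log, only the item value (item=tuned) reported further below.

- name: Restart tuned to apply active profile, mode changes
  ansible.builtin.systemd:
    name: "{{ item }}"        # resolves to "tuned" in the invocation shown above
    state: restarted          # matches module_args "state": "restarted"
    enabled: true             # matches module_args "enabled": true
    scope: system             # matches module_args "scope": "system"
    daemon_reload: false      # matches module_args "daemon_reload": false
  loop:
    - tuned                   # assumed literal; the real loop variable is not visible in this log
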
8667 1726773035.09315: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'restarted', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8667 1726773035.09338: _low_level_execute_command(): starting 8667 1726773035.09344: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773034.3933396-8667-263816202868168/ > /dev/null 2>&1 && sleep 0' 8667 1726773035.11954: stderr chunk (state=2): >>><<< 8667 1726773035.11965: stdout chunk (state=2): >>><<< 8667 1726773035.11988: _low_level_execute_command() done: rc=0, stdout=, stderr= 8667 1726773035.11999: handler run complete 8667 1726773035.12050: attempt loop complete, returning result 8667 1726773035.12070: variable 'item' from source: unknown 8667 1726773035.12154: variable 'item' from source: unknown changed: [managed_node1] => (item=tuned) => { "ansible_loop_var": "item", "changed": true, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:06:10 EDT", "ActiveEnterTimestampMonotonic": "24900735", "ActiveExitTimestampMonotonic": "0", "ActiveState": "active", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:06:09 EDT", "AssertTimestampMonotonic": "23762468", "Before": "shutdown.target multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ConditionTimestampMonotonic": "23762466", "ConfigurationDirectoryMode": "0755", "Conflicts": "power-profiles-daemon.service shutdown.target cpupower.service auto-cpufreq.service 
tlp.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "664", "ExecMainStartTimestamp": "Thu 2024-09-19 15:06:09 EDT", "ExecMainStartTimestampMonotonic": "23763308", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:06:09 EDT] ; stop_time=[n/a] ; pid=664 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestamp": "Thu 2024-09-19 15:06:09 EDT", "InactiveExitTimestampMonotonic": "23763483", "InvocationID": "883d3d1b58be437785da31f48ec3b86d", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "664", "MemoryAccounting": "yes", "MemoryCurrent": "18620416", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", 
"ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:06:10 EDT", "StateChangeTimestampMonotonic": "24900735", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:06:10 EDT", "WatchdogTimestampMonotonic": "24900732", "WatchdogUSec": "0" } } 8667 1726773035.12301: dumping result to json 8667 1726773035.12319: done dumping result, returning 8667 1726773035.12334: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes [0affffe7-6841-f581-0619-000000000150] 8667 1726773035.12343: sending task result for task 0affffe7-6841-f581-0619-000000000150 8667 1726773035.12458: done sending task result for task 0affffe7-6841-f581-0619-000000000150 8667 1726773035.12462: WORKER PROCESS EXITING 8208 1726773035.13178: no more pending results, returning what we have 8208 1726773035.13182: results queue empty 8208 1726773035.13182: checking for any_errors_fatal 8208 1726773035.13190: done checking for any_errors_fatal 8208 1726773035.13191: checking for max_fail_percentage 8208 1726773035.13193: done checking for max_fail_percentage 8208 1726773035.13193: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.13194: done checking to see if all hosts have failed 8208 1726773035.13194: getting the remaining hosts for this loop 8208 1726773035.13195: done getting the remaining hosts for this loop 8208 1726773035.13198: getting the next task for host managed_node1 8208 1726773035.13204: done getting next task for host managed_node1 8208 1726773035.13207: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8208 1726773035.13209: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773035.13220: getting variables 8208 1726773035.13222: in VariableManager get_vars() 8208 1726773035.13249: Calling all_inventory to load vars for managed_node1 8208 1726773035.13252: Calling groups_inventory to load vars for managed_node1 8208 1726773035.13254: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.13264: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.13270: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.13274: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.13432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.13656: done with get_vars() 8208 1726773035.13670: done getting variables 8208 1726773035.13729: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Tuned apply settings] ******** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:157 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.793) 0:00:15.811 **** 8208 1726773035.13752: entering _queue_task() for managed_node1/command 8208 1726773035.13941: worker is 1 (out of 1 available) 8208 1726773035.13956: exiting _queue_task() for managed_node1/command 8208 1726773035.13970: done queuing things up, now waiting for results queue to drain 8208 1726773035.13972: waiting for pending results... 
8692 1726773035.14109: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings 8692 1726773035.14231: in run() - task 0affffe7-6841-f581-0619-000000000151 8692 1726773035.14248: variable 'ansible_search_path' from source: unknown 8692 1726773035.14253: variable 'ansible_search_path' from source: unknown 8692 1726773035.14282: calling self._execute() 8692 1726773035.14347: variable 'ansible_host' from source: host vars for 'managed_node1' 8692 1726773035.14356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8692 1726773035.14365: variable 'omit' from source: magic vars 8692 1726773035.14703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8692 1726773035.14882: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8692 1726773035.14919: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8692 1726773035.14944: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8692 1726773035.14978: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8692 1726773035.15064: variable '__kernel_settings_register_profile' from source: set_fact 8692 1726773035.15086: Evaluated conditional (not __kernel_settings_register_profile is changed): False 8692 1726773035.15092: when evaluation is False, skipping this task 8692 1726773035.15096: _execute() done 8692 1726773035.15100: dumping result to json 8692 1726773035.15104: done dumping result, returning 8692 1726773035.15110: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Tuned apply settings [0affffe7-6841-f581-0619-000000000151] 8692 1726773035.15117: sending task result for task 0affffe7-6841-f581-0619-000000000151 8692 1726773035.15139: done sending task result for task 0affffe7-6841-f581-0619-000000000151 8692 1726773035.15142: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __kernel_settings_register_profile is changed", "skip_reason": "Conditional result was False" } 8208 1726773035.15254: no more pending results, returning what we have 8208 1726773035.15257: results queue empty 8208 1726773035.15257: checking for any_errors_fatal 8208 1726773035.15273: done checking for any_errors_fatal 8208 1726773035.15274: checking for max_fail_percentage 8208 1726773035.15276: done checking for max_fail_percentage 8208 1726773035.15276: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.15277: done checking to see if all hosts have failed 8208 1726773035.15277: getting the remaining hosts for this loop 8208 1726773035.15279: done getting the remaining hosts for this loop 8208 1726773035.15281: getting the next task for host managed_node1 8208 1726773035.15289: done getting next task for host managed_node1 8208 1726773035.15293: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8208 1726773035.15297: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773035.15310: getting variables 8208 1726773035.15311: in VariableManager get_vars() 8208 1726773035.15345: Calling all_inventory to load vars for managed_node1 8208 1726773035.15347: Calling groups_inventory to load vars for managed_node1 8208 1726773035.15349: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.15357: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.15359: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.15361: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.15467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.15587: done with get_vars() 8208 1726773035.15596: done getting variables TASK [fedora.linux_system_roles.kernel_settings : Verify settings] ************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:166 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.019) 0:00:15.830 **** 8208 1726773035.15661: entering _queue_task() for managed_node1/include_tasks 8208 1726773035.15836: worker is 1 (out of 1 available) 8208 1726773035.15851: exiting _queue_task() for managed_node1/include_tasks 8208 1726773035.15861: done queuing things up, now waiting for results queue to drain 8208 1726773035.15863: waiting for pending results... 
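The "Tuned apply settings" task above was skipped because its guard, not __kernel_settings_register_profile is changed, evaluated to False (the profile registration in this run reported a change). A minimal sketch of that guard pattern, assuming a hypothetical tuned-adm profile command and an assumed profile name, neither of which is visible in this log:

- name: Tuned apply settings
  ansible.builtin.command:
    cmd: tuned-adm profile kernel_settings   # hypothetical command and profile name, for illustration only
  when: not __kernel_settings_register_profile is changed
  # In this run the registered profile result was changed, so the condition was
  # False and the task was skipped, exactly as the skip record above shows.
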
8693 1726773035.15991: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Verify settings 8693 1726773035.16122: in run() - task 0affffe7-6841-f581-0619-000000000152 8693 1726773035.16138: variable 'ansible_search_path' from source: unknown 8693 1726773035.16142: variable 'ansible_search_path' from source: unknown 8693 1726773035.16172: calling self._execute() 8693 1726773035.16235: variable 'ansible_host' from source: host vars for 'managed_node1' 8693 1726773035.16243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8693 1726773035.16252: variable 'omit' from source: magic vars 8693 1726773035.16626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8693 1726773035.16811: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8693 1726773035.16940: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8693 1726773035.16975: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8693 1726773035.17007: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8693 1726773035.17111: variable '__kernel_settings_register_apply' from source: set_fact 8693 1726773035.17130: Evaluated conditional (__kernel_settings_register_apply is changed): True 8693 1726773035.17138: _execute() done 8693 1726773035.17142: dumping result to json 8693 1726773035.17145: done dumping result, returning 8693 1726773035.17150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Verify settings [0affffe7-6841-f581-0619-000000000152] 8693 1726773035.17156: sending task result for task 0affffe7-6841-f581-0619-000000000152 8693 1726773035.17187: done sending task result for task 0affffe7-6841-f581-0619-000000000152 8693 1726773035.17191: WORKER PROCESS EXITING 8208 1726773035.17537: no more pending results, returning what we have 8208 1726773035.17540: in VariableManager get_vars() 8208 1726773035.17571: Calling all_inventory to load vars for managed_node1 8208 1726773035.17573: Calling groups_inventory to load vars for managed_node1 8208 1726773035.17574: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.17581: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.17582: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.17588: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.17697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.17848: done with get_vars() 8208 1726773035.17853: variable 'ansible_search_path' from source: unknown 8208 1726773035.17853: variable 'ansible_search_path' from source: unknown 8208 1726773035.17877: we have included files to process 8208 1726773035.17878: generating all_blocks data 8208 1726773035.17881: done generating all_blocks data 8208 1726773035.17887: processing included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8208 1726773035.17888: loading included file: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml 8208 1726773035.17890: Loading data from 
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml included: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml for managed_node1 8208 1726773035.18201: done processing included file 8208 1726773035.18204: iterating over new_blocks loaded from include file 8208 1726773035.18204: in VariableManager get_vars() 8208 1726773035.18221: done with get_vars() 8208 1726773035.18222: filtering new block on tags 8208 1726773035.18238: done filtering new block on tags 8208 1726773035.18239: done iterating over new_blocks loaded from include file 8208 1726773035.18240: extending task lists for all hosts with included blocks 8208 1726773035.18794: done extending task lists 8208 1726773035.18795: done processing included files 8208 1726773035.18796: results queue empty 8208 1726773035.18796: checking for any_errors_fatal 8208 1726773035.18798: done checking for any_errors_fatal 8208 1726773035.18799: checking for max_fail_percentage 8208 1726773035.18799: done checking for max_fail_percentage 8208 1726773035.18800: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.18800: done checking to see if all hosts have failed 8208 1726773035.18800: getting the remaining hosts for this loop 8208 1726773035.18801: done getting the remaining hosts for this loop 8208 1726773035.18803: getting the next task for host managed_node1 8208 1726773035.18806: done getting next task for host managed_node1 8208 1726773035.18808: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8208 1726773035.18810: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773035.18817: getting variables 8208 1726773035.18818: in VariableManager get_vars() 8208 1726773035.18827: Calling all_inventory to load vars for managed_node1 8208 1726773035.18829: Calling groups_inventory to load vars for managed_node1 8208 1726773035.18830: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.18833: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.18834: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.18836: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.18940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.19054: done with get_vars() 8208 1726773035.19060: done getting variables 8208 1726773035.19091: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.034) 0:00:15.865 **** 8208 1726773035.19119: entering _queue_task() for managed_node1/command 8208 1726773035.19311: worker is 1 (out of 1 available) 8208 1726773035.19325: exiting _queue_task() for managed_node1/command 8208 1726773035.19337: done queuing things up, now waiting for results queue to drain 8208 1726773035.19339: waiting for pending results... 8695 1726773035.19457: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly 8695 1726773035.19590: in run() - task 0affffe7-6841-f581-0619-000000000231 8695 1726773035.19606: variable 'ansible_search_path' from source: unknown 8695 1726773035.19611: variable 'ansible_search_path' from source: unknown 8695 1726773035.19640: calling self._execute() 8695 1726773035.19701: variable 'ansible_host' from source: host vars for 'managed_node1' 8695 1726773035.19710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8695 1726773035.19719: variable 'omit' from source: magic vars 8695 1726773035.19794: variable 'omit' from source: magic vars 8695 1726773035.19839: variable 'omit' from source: magic vars 8695 1726773035.19860: variable 'omit' from source: magic vars 8695 1726773035.19898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8695 1726773035.19926: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8695 1726773035.19946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8695 1726773035.19962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8695 1726773035.19973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8695 1726773035.20000: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8695 1726773035.20006: variable 'ansible_host' from source: host vars for 'managed_node1' 8695 1726773035.20010: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8695 1726773035.20079: Set connection var ansible_shell_executable to /bin/sh 8695 1726773035.20084: Set connection var ansible_connection to ssh 8695 1726773035.20090: Set connection var ansible_module_compression to ZIP_DEFLATED 8695 1726773035.20095: Set connection var ansible_timeout to 10 8695 1726773035.20097: Set connection var ansible_shell_type to sh 8695 1726773035.20101: Set connection var ansible_pipelining to False 8695 1726773035.20117: variable 'ansible_shell_executable' from source: unknown 8695 1726773035.20121: variable 'ansible_connection' from source: unknown 8695 1726773035.20124: variable 'ansible_module_compression' from source: unknown 8695 1726773035.20125: variable 'ansible_shell_type' from source: unknown 8695 1726773035.20127: variable 'ansible_shell_executable' from source: unknown 8695 1726773035.20129: variable 'ansible_host' from source: host vars for 'managed_node1' 8695 1726773035.20131: variable 'ansible_pipelining' from source: unknown 8695 1726773035.20132: variable 'ansible_timeout' from source: unknown 8695 1726773035.20135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8695 1726773035.20226: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8695 1726773035.20238: variable 'omit' from source: magic vars 8695 1726773035.20241: starting attempt loop 8695 1726773035.20243: running the handler 8695 1726773035.20254: _low_level_execute_command(): starting 8695 1726773035.20259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8695 1726773035.22744: stdout chunk (state=2): >>>/root <<< 8695 1726773035.22861: stderr chunk (state=3): >>><<< 8695 1726773035.22869: stdout chunk (state=3): >>><<< 8695 1726773035.22891: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8695 1726773035.22906: _low_level_execute_command(): starting 8695 1726773035.22915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386 `" && echo ansible-tmp-1726773035.2290099-8695-187754071289386="` echo /root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386 `" ) && sleep 0' 8695 1726773035.25478: stdout chunk (state=2): >>>ansible-tmp-1726773035.2290099-8695-187754071289386=/root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386 <<< 8695 1726773035.25648: stderr chunk (state=3): >>><<< 8695 1726773035.25657: stdout chunk (state=3): >>><<< 8695 1726773035.25679: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773035.2290099-8695-187754071289386=/root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386 , stderr= 8695 1726773035.25714: variable 'ansible_module_compression' from source: unknown 8695 1726773035.25774: ANSIBALLZ: Using generic lock for ansible.legacy.command 8695 1726773035.25779: ANSIBALLZ: Acquiring lock 8695 1726773035.25783: ANSIBALLZ: Lock acquired: 139627423671568 8695 1726773035.25790: ANSIBALLZ: Creating module 8695 1726773035.36182: ANSIBALLZ: Writing module into payload 8695 1726773035.36263: ANSIBALLZ: Writing module 8695 1726773035.36286: ANSIBALLZ: Renaming module 8695 1726773035.36294: 
ANSIBALLZ: Done creating module 8695 1726773035.36309: variable 'ansible_facts' from source: unknown 8695 1726773035.36371: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386/AnsiballZ_command.py 8695 1726773035.36478: Sending initial data 8695 1726773035.36487: Sent initial data (154 bytes) 8695 1726773035.39291: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpd4rvises /root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386/AnsiballZ_command.py <<< 8695 1726773035.40913: stderr chunk (state=3): >>><<< 8695 1726773035.40925: stdout chunk (state=3): >>><<< 8695 1726773035.40945: done transferring module to remote 8695 1726773035.40956: _low_level_execute_command(): starting 8695 1726773035.40961: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386/ /root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386/AnsiballZ_command.py && sleep 0' 8695 1726773035.43429: stderr chunk (state=2): >>><<< 8695 1726773035.43439: stdout chunk (state=2): >>><<< 8695 1726773035.43456: _low_level_execute_command() done: rc=0, stdout=, stderr= 8695 1726773035.43461: _low_level_execute_command(): starting 8695 1726773035.43469: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386/AnsiballZ_command.py && sleep 0' 8695 1726773035.70487: stdout chunk (state=2): >>> {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:10:35.582444", "end": "2024-09-19 15:10:35.702773", "delta": "0:00:00.120329", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8695 1726773035.71659: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8695 1726773035.71711: stderr chunk (state=3): >>><<< 8695 1726773035.71718: stdout chunk (state=3): >>><<< 8695 1726773035.71737: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "Verification succeeded, current system settings match the preset profile.\nSee TuneD log file ('/var/log/tuned/tuned.log') for details.", "stderr": "", "rc": 0, "cmd": ["tuned-adm", "verify", "-i"], "start": "2024-09-19 15:10:35.582444", "end": "2024-09-19 15:10:35.702773", "delta": "0:00:00.120329", "msg": "", "invocation": {"module_args": {"_raw_params": "tuned-adm verify -i", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
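The JSON result above comes from running tuned-adm verify -i on the managed node, which confirmed that the current system settings match the preset profile. A minimal sketch of a task that would produce this invocation; the register name and the changed_when guard are inferred from the surrounding log output (the registered variable is referenced by later conditionals, and the final task result is reported as ok with changed=false even though the module returned changed=true), not taken from the role source:

- name: Check that settings are applied correctly
  ansible.builtin.command:
    cmd: tuned-adm verify -i                            # matches "cmd": ["tuned-adm", "verify", "-i"] above
  register: __kernel_settings_register_verify_values    # inferred from the conditionals evaluated further below
  changed_when: false                                   # inferred from the ok/changed=false result reported below
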
8695 1726773035.71771: done with _execute_module (ansible.legacy.command, {'_raw_params': 'tuned-adm verify -i', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8695 1726773035.71780: _low_level_execute_command(): starting 8695 1726773035.71788: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773035.2290099-8695-187754071289386/ > /dev/null 2>&1 && sleep 0' 8695 1726773035.74361: stderr chunk (state=2): >>><<< 8695 1726773035.74375: stdout chunk (state=2): >>><<< 8695 1726773035.74393: _low_level_execute_command() done: rc=0, stdout=, stderr= 8695 1726773035.74402: handler run complete 8695 1726773035.74420: Evaluated conditional (False): False 8695 1726773035.74430: attempt loop complete, returning result 8695 1726773035.74433: _execute() done 8695 1726773035.74439: dumping result to json 8695 1726773035.74445: done dumping result, returning 8695 1726773035.74457: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly [0affffe7-6841-f581-0619-000000000231] 8695 1726773035.74467: sending task result for task 0affffe7-6841-f581-0619-000000000231 8695 1726773035.74500: done sending task result for task 0affffe7-6841-f581-0619-000000000231 8695 1726773035.74504: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "tuned-adm", "verify", "-i" ], "delta": "0:00:00.120329", "end": "2024-09-19 15:10:35.702773", "rc": 0, "start": "2024-09-19 15:10:35.582444" } STDOUT: Verification succeeded, current system settings match the preset profile. See TuneD log file ('/var/log/tuned/tuned.log') for details. 8208 1726773035.74662: no more pending results, returning what we have 8208 1726773035.74665: results queue empty 8208 1726773035.74666: checking for any_errors_fatal 8208 1726773035.74667: done checking for any_errors_fatal 8208 1726773035.74668: checking for max_fail_percentage 8208 1726773035.74669: done checking for max_fail_percentage 8208 1726773035.74670: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.74671: done checking to see if all hosts have failed 8208 1726773035.74671: getting the remaining hosts for this loop 8208 1726773035.74672: done getting the remaining hosts for this loop 8208 1726773035.74676: getting the next task for host managed_node1 8208 1726773035.74683: done getting next task for host managed_node1 8208 1726773035.74688: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8208 1726773035.74692: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773035.74701: getting variables 8208 1726773035.74702: in VariableManager get_vars() 8208 1726773035.74739: Calling all_inventory to load vars for managed_node1 8208 1726773035.74742: Calling groups_inventory to load vars for managed_node1 8208 1726773035.74744: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.74752: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.74754: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.74757: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.74901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.75044: done with get_vars() 8208 1726773035.75058: done getting variables 8208 1726773035.75146: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Get last verify results from log] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:12 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.560) 0:00:16.426 **** 8208 1726773035.75181: entering _queue_task() for managed_node1/shell 8208 1726773035.75182: Creating lock for shell 8208 1726773035.75420: worker is 1 (out of 1 available) 8208 1726773035.75434: exiting _queue_task() for managed_node1/shell 8208 1726773035.75445: done queuing things up, now waiting for results queue to drain 8208 1726773035.75447: waiting for pending results... 
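The next task, "Get last verify results from log", is a shell action that only runs when the verification command failed. A hypothetical sketch of such a step; the actual shell command and register name are not visible in this log, and the log path is taken from the tuned-adm output above:

- name: Get last verify results from log
  ansible.builtin.shell:
    cmd: grep -i verif /var/log/tuned/tuned.log | tail -n 20   # hypothetical command, for illustration only
  register: __kernel_settings_register_verify_log              # hypothetical register name
  when: __kernel_settings_register_verify_values is failed     # matches the false_condition recorded below
  changed_when: false
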
8721 1726773035.75705: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log 8721 1726773035.75876: in run() - task 0affffe7-6841-f581-0619-000000000232 8721 1726773035.75895: variable 'ansible_search_path' from source: unknown 8721 1726773035.75901: variable 'ansible_search_path' from source: unknown 8721 1726773035.75930: calling self._execute() 8721 1726773035.75989: variable 'ansible_host' from source: host vars for 'managed_node1' 8721 1726773035.75999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8721 1726773035.76007: variable 'omit' from source: magic vars 8721 1726773035.76778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8721 1726773035.76971: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8721 1726773035.77013: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8721 1726773035.77060: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8721 1726773035.77094: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8721 1726773035.77180: variable '__kernel_settings_register_verify_values' from source: set_fact 8721 1726773035.77201: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 8721 1726773035.77210: when evaluation is False, skipping this task 8721 1726773035.77214: _execute() done 8721 1726773035.77216: dumping result to json 8721 1726773035.77218: done dumping result, returning 8721 1726773035.77223: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Get last verify results from log [0affffe7-6841-f581-0619-000000000232] 8721 1726773035.77227: sending task result for task 0affffe7-6841-f581-0619-000000000232 8721 1726773035.77246: done sending task result for task 0affffe7-6841-f581-0619-000000000232 8721 1726773035.77248: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8208 1726773035.77459: no more pending results, returning what we have 8208 1726773035.77462: results queue empty 8208 1726773035.77463: checking for any_errors_fatal 8208 1726773035.77473: done checking for any_errors_fatal 8208 1726773035.77473: checking for max_fail_percentage 8208 1726773035.77475: done checking for max_fail_percentage 8208 1726773035.77475: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.77477: done checking to see if all hosts have failed 8208 1726773035.77478: getting the remaining hosts for this loop 8208 1726773035.77479: done getting the remaining hosts for this loop 8208 1726773035.77483: getting the next task for host managed_node1 8208 1726773035.77491: done getting next task for host managed_node1 8208 1726773035.77494: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8208 1726773035.77498: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773035.77510: getting variables 8208 1726773035.77511: in VariableManager get_vars() 8208 1726773035.77546: Calling all_inventory to load vars for managed_node1 8208 1726773035.77549: Calling groups_inventory to load vars for managed_node1 8208 1726773035.77550: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.77557: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.77559: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.77560: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.77918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.78043: done with get_vars() 8208 1726773035.78051: done getting variables 8208 1726773035.78107: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:23 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.029) 0:00:16.455 **** 8208 1726773035.78137: entering _queue_task() for managed_node1/fail 8208 1726773035.78338: worker is 1 (out of 1 available) 8208 1726773035.78353: exiting _queue_task() for managed_node1/fail 8208 1726773035.78363: done queuing things up, now waiting for results queue to drain 8208 1726773035.78368: waiting for pending results... 
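The "Report errors that are not bootloader errors" task queued above uses the fail action under the same verify-failed guard. A minimal sketch, with the message text assumed for illustration:

- name: Report errors that are not bootloader errors
  ansible.builtin.fail:
    msg: tuned-adm verify reported errors - see /var/log/tuned/tuned.log   # assumed wording, not from the role
  when: __kernel_settings_register_verify_values is failed
  # Skipped in this run because verification succeeded, so the condition evaluated to False (see below).
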
8724 1726773035.78505: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors 8724 1726773035.78647: in run() - task 0affffe7-6841-f581-0619-000000000233 8724 1726773035.78665: variable 'ansible_search_path' from source: unknown 8724 1726773035.78670: variable 'ansible_search_path' from source: unknown 8724 1726773035.78699: calling self._execute() 8724 1726773035.78772: variable 'ansible_host' from source: host vars for 'managed_node1' 8724 1726773035.78781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8724 1726773035.78791: variable 'omit' from source: magic vars 8724 1726773035.79200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8724 1726773035.79490: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8724 1726773035.79539: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8724 1726773035.79576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8724 1726773035.79615: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8724 1726773035.79732: variable '__kernel_settings_register_verify_values' from source: set_fact 8724 1726773035.79753: Evaluated conditional (__kernel_settings_register_verify_values is failed): False 8724 1726773035.79758: when evaluation is False, skipping this task 8724 1726773035.79762: _execute() done 8724 1726773035.79764: dumping result to json 8724 1726773035.79768: done dumping result, returning 8724 1726773035.79777: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Report errors that are not bootloader errors [0affffe7-6841-f581-0619-000000000233] 8724 1726773035.79784: sending task result for task 0affffe7-6841-f581-0619-000000000233 8724 1726773035.79817: done sending task result for task 0affffe7-6841-f581-0619-000000000233 8724 1726773035.79820: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__kernel_settings_register_verify_values is failed", "skip_reason": "Conditional result was False" } 8208 1726773035.80235: no more pending results, returning what we have 8208 1726773035.80238: results queue empty 8208 1726773035.80238: checking for any_errors_fatal 8208 1726773035.80243: done checking for any_errors_fatal 8208 1726773035.80243: checking for max_fail_percentage 8208 1726773035.80244: done checking for max_fail_percentage 8208 1726773035.80244: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.80245: done checking to see if all hosts have failed 8208 1726773035.80245: getting the remaining hosts for this loop 8208 1726773035.80246: done getting the remaining hosts for this loop 8208 1726773035.80249: getting the next task for host managed_node1 8208 1726773035.80255: done getting next task for host managed_node1 8208 1726773035.80258: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8208 1726773035.80261: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773035.80278: getting variables 8208 1726773035.80280: in VariableManager get_vars() 8208 1726773035.80320: Calling all_inventory to load vars for managed_node1 8208 1726773035.80323: Calling groups_inventory to load vars for managed_node1 8208 1726773035.80325: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.80334: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.80337: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.80339: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.80492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.80649: done with get_vars() 8208 1726773035.80659: done getting variables 8208 1726773035.80711: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:177 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.026) 0:00:16.481 **** 8208 1726773035.80749: entering _queue_task() for managed_node1/set_fact 8208 1726773035.80979: worker is 1 (out of 1 available) 8208 1726773035.80997: exiting _queue_task() for managed_node1/set_fact 8208 1726773035.81009: done queuing things up, now waiting for results queue to drain 8208 1726773035.81013: waiting for pending results... 
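The final task in this section records whether a reboot is required via set_fact. A minimal sketch consistent with the result printed below (kernel_settings_reboot_required: false); how the role actually computes the value is not visible in this log:

- name: Set the flag that reboot is needed to apply changes
  ansible.builtin.set_fact:
    kernel_settings_reboot_required: false   # matches the ansible_facts value in the result shown below
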
8725 1726773035.81155: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes 8725 1726773035.81295: in run() - task 0affffe7-6841-f581-0619-000000000153 8725 1726773035.81318: variable 'ansible_search_path' from source: unknown 8725 1726773035.81326: variable 'ansible_search_path' from source: unknown 8725 1726773035.81362: calling self._execute() 8725 1726773035.81457: variable 'ansible_host' from source: host vars for 'managed_node1' 8725 1726773035.81464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8725 1726773035.81471: variable 'omit' from source: magic vars 8725 1726773035.81555: variable 'omit' from source: magic vars 8725 1726773035.81598: variable 'omit' from source: magic vars 8725 1726773035.81620: variable 'omit' from source: magic vars 8725 1726773035.81657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8725 1726773035.81705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8725 1726773035.81730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8725 1726773035.81749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8725 1726773035.81760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8725 1726773035.81789: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8725 1726773035.81797: variable 'ansible_host' from source: host vars for 'managed_node1' 8725 1726773035.81801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8725 1726773035.81872: Set connection var ansible_shell_executable to /bin/sh 8725 1726773035.81876: Set connection var ansible_connection to ssh 8725 1726773035.81880: Set connection var ansible_module_compression to ZIP_DEFLATED 8725 1726773035.81890: Set connection var ansible_timeout to 10 8725 1726773035.81894: Set connection var ansible_shell_type to sh 8725 1726773035.81899: Set connection var ansible_pipelining to False 8725 1726773035.81915: variable 'ansible_shell_executable' from source: unknown 8725 1726773035.81917: variable 'ansible_connection' from source: unknown 8725 1726773035.81919: variable 'ansible_module_compression' from source: unknown 8725 1726773035.81921: variable 'ansible_shell_type' from source: unknown 8725 1726773035.81923: variable 'ansible_shell_executable' from source: unknown 8725 1726773035.81924: variable 'ansible_host' from source: host vars for 'managed_node1' 8725 1726773035.81926: variable 'ansible_pipelining' from source: unknown 8725 1726773035.81928: variable 'ansible_timeout' from source: unknown 8725 1726773035.81930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8725 1726773035.82206: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8725 1726773035.82224: variable 'omit' from source: magic vars 8725 1726773035.82231: starting attempt loop 8725 1726773035.82235: running the handler 8725 1726773035.82246: handler run complete 8725 1726773035.82257: 
attempt loop complete, returning result 8725 1726773035.82260: _execute() done 8725 1726773035.82263: dumping result to json 8725 1726773035.82266: done dumping result, returning 8725 1726773035.82272: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set the flag that reboot is needed to apply changes [0affffe7-6841-f581-0619-000000000153] 8725 1726773035.82278: sending task result for task 0affffe7-6841-f581-0619-000000000153 8725 1726773035.82307: done sending task result for task 0affffe7-6841-f581-0619-000000000153 8725 1726773035.82311: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "kernel_settings_reboot_required": false }, "changed": false } 8208 1726773035.82779: no more pending results, returning what we have 8208 1726773035.82781: results queue empty 8208 1726773035.82782: checking for any_errors_fatal 8208 1726773035.82788: done checking for any_errors_fatal 8208 1726773035.82789: checking for max_fail_percentage 8208 1726773035.82790: done checking for max_fail_percentage 8208 1726773035.82790: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.82791: done checking to see if all hosts have failed 8208 1726773035.82791: getting the remaining hosts for this loop 8208 1726773035.82792: done getting the remaining hosts for this loop 8208 1726773035.82797: getting the next task for host managed_node1 8208 1726773035.82805: done getting next task for host managed_node1 8208 1726773035.82809: ^ task is: TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8208 1726773035.82813: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? 
False 8208 1726773035.82825: getting variables 8208 1726773035.82826: in VariableManager get_vars() 8208 1726773035.82856: Calling all_inventory to load vars for managed_node1 8208 1726773035.82858: Calling groups_inventory to load vars for managed_node1 8208 1726773035.82859: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.82868: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.82869: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.82871: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.83031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.83156: done with get_vars() 8208 1726773035.83164: done getting variables 8208 1726773035.83210: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing] *** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:181 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.024) 0:00:16.506 **** 8208 1726773035.83233: entering _queue_task() for managed_node1/set_fact 8208 1726773035.83418: worker is 1 (out of 1 available) 8208 1726773035.83432: exiting _queue_task() for managed_node1/set_fact 8208 1726773035.83444: done queuing things up, now waiting for results queue to drain 8208 1726773035.83446: waiting for pending results... 8727 1726773035.83572: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing 8727 1726773035.83691: in run() - task 0affffe7-6841-f581-0619-000000000154 8727 1726773035.83706: variable 'ansible_search_path' from source: unknown 8727 1726773035.83709: variable 'ansible_search_path' from source: unknown 8727 1726773035.83734: calling self._execute() 8727 1726773035.83798: variable 'ansible_host' from source: host vars for 'managed_node1' 8727 1726773035.83808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8727 1726773035.83816: variable 'omit' from source: magic vars 8727 1726773035.83891: variable 'omit' from source: magic vars 8727 1726773035.83928: variable 'omit' from source: magic vars 8727 1726773035.84193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 8727 1726773035.84378: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 8727 1726773035.84414: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 8727 1726773035.84442: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 8727 1726773035.84471: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 8727 1726773035.84576: variable '__kernel_settings_register_profile' from source: set_fact 8727 1726773035.84590: variable '__kernel_settings_register_mode' from source: set_fact 8727 1726773035.84598: variable '__kernel_settings_register_apply' from source: set_fact 8727 1726773035.84633: variable 'omit' from source: magic vars 8727 1726773035.84656: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8727 1726773035.84680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8727 1726773035.84698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8727 1726773035.84750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8727 1726773035.84762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8727 1726773035.84839: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8727 1726773035.84845: variable 'ansible_host' from source: host vars for 'managed_node1' 8727 1726773035.84849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8727 1726773035.84948: Set connection var ansible_shell_executable to /bin/sh 8727 1726773035.84955: Set connection var ansible_connection to ssh 8727 1726773035.84961: Set connection var ansible_module_compression to ZIP_DEFLATED 8727 1726773035.84968: Set connection var ansible_timeout to 10 8727 1726773035.84972: Set connection var ansible_shell_type to sh 8727 1726773035.84979: Set connection var ansible_pipelining to False 8727 1726773035.84999: variable 'ansible_shell_executable' from source: unknown 8727 1726773035.85002: variable 'ansible_connection' from source: unknown 8727 1726773035.85004: variable 'ansible_module_compression' from source: unknown 8727 1726773035.85006: variable 'ansible_shell_type' from source: unknown 8727 1726773035.85007: variable 'ansible_shell_executable' from source: unknown 8727 1726773035.85009: variable 'ansible_host' from source: host vars for 'managed_node1' 8727 1726773035.85011: variable 'ansible_pipelining' from source: unknown 8727 1726773035.85013: variable 'ansible_timeout' from source: unknown 8727 1726773035.85015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8727 1726773035.85093: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8727 1726773035.85104: variable 'omit' from source: magic vars 8727 1726773035.85109: starting attempt loop 8727 1726773035.85111: running the handler 8727 1726773035.85118: handler run complete 8727 1726773035.85123: attempt loop complete, returning result 8727 1726773035.85125: _execute() done 8727 1726773035.85127: dumping result to json 8727 1726773035.85129: done dumping result, returning 8727 1726773035.85133: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.kernel_settings : Set flag to indicate changed for testing [0affffe7-6841-f581-0619-000000000154] 8727 1726773035.85138: sending task result for task 0affffe7-6841-f581-0619-000000000154 8727 1726773035.85162: done sending task result for task 0affffe7-6841-f581-0619-000000000154 8727 1726773035.85164: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__kernel_settings_changed": true }, "changed": false } 8208 1726773035.85471: no more pending results, returning what we have 8208 1726773035.85475: results queue empty 8208 1726773035.85475: checking for any_errors_fatal 8208 1726773035.85481: done 
checking for any_errors_fatal 8208 1726773035.85481: checking for max_fail_percentage 8208 1726773035.85484: done checking for max_fail_percentage 8208 1726773035.85486: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.85487: done checking to see if all hosts have failed 8208 1726773035.85488: getting the remaining hosts for this loop 8208 1726773035.85489: done getting the remaining hosts for this loop 8208 1726773035.85492: getting the next task for host managed_node1 8208 1726773035.85501: done getting next task for host managed_node1 8208 1726773035.85503: ^ task is: TASK: meta (role_complete) 8208 1726773035.85507: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773035.85518: getting variables 8208 1726773035.85520: in VariableManager get_vars() 8208 1726773035.85555: Calling all_inventory to load vars for managed_node1 8208 1726773035.85558: Calling groups_inventory to load vars for managed_node1 8208 1726773035.85560: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.85571: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.85573: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.85578: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.85748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.85940: done with get_vars() 8208 1726773035.85948: done getting variables 8208 1726773035.86024: done queuing things up, now waiting for results queue to drain 8208 1726773035.86029: results queue empty 8208 1726773035.86030: checking for any_errors_fatal 8208 1726773035.86033: done checking for any_errors_fatal 8208 1726773035.86033: checking for max_fail_percentage 8208 1726773035.86034: done checking for max_fail_percentage 8208 1726773035.86034: checking to see if all hosts have failed and the running result is not ok 8208 1726773035.86035: done checking to see if all hosts have failed 8208 1726773035.86035: getting the remaining hosts for this loop 8208 1726773035.86035: done getting the remaining hosts for this loop 8208 1726773035.86037: getting the next task for host managed_node1 8208 1726773035.86039: done getting next task for host managed_node1 8208 1726773035.86040: ^ task is: TASK: Verify no settings 8208 1726773035.86041: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773035.86043: getting variables 8208 1726773035.86044: in VariableManager get_vars() 8208 1726773035.86053: Calling all_inventory to load vars for managed_node1 8208 1726773035.86055: Calling groups_inventory to load vars for managed_node1 8208 1726773035.86056: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773035.86059: Calling all_plugins_play to load vars for managed_node1 8208 1726773035.86061: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773035.86062: Calling groups_plugins_play to load vars for managed_node1 8208 1726773035.86144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773035.86289: done with get_vars() 8208 1726773035.86296: done getting variables 8208 1726773035.86323: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify no settings] ****************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20 Thursday 19 September 2024 15:10:35 -0400 (0:00:00.031) 0:00:16.537 **** 8208 1726773035.86344: entering _queue_task() for managed_node1/shell 8208 1726773035.86534: worker is 1 (out of 1 available) 8208 1726773035.86549: exiting _queue_task() for managed_node1/shell 8208 1726773035.86560: done queuing things up, now waiting for results queue to drain 8208 1726773035.86563: waiting for pending results... 
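
The shell task queued here checks that the generated tuned profile contains none of the settings sections the role manages; the exact script is visible verbatim in the module invocation further below. Written as a task in cleanup.yml it would look roughly like the following sketch (the script body is copied from the log; changed_when: false is inferred from the "Evaluated conditional (False): False" line and the final changed: false result).

# Sketch of the "Verify no settings" check; script body taken from the module args below.
- name: Verify no settings
  shell: |
    set -euxo pipefail
    exec 1>&2
    rc=0
    conf=/etc/tuned/kernel_settings/tuned.conf
    for section in sysctl sysfs systemd vm; do
      if grep ^\\["$section"\\] "$conf"; then
        echo ERROR: "$section" settings present
        rc=1
      fi
    done
    exit "$rc"
  changed_when: false   # inferred: the task reports changed=false even though the command ran
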
8729 1726773035.86680: running TaskExecutor() for managed_node1/TASK: Verify no settings 8729 1726773035.86798: in run() - task 0affffe7-6841-f581-0619-000000000098 8729 1726773035.86815: variable 'ansible_search_path' from source: unknown 8729 1726773035.86820: variable 'ansible_search_path' from source: unknown 8729 1726773035.86849: calling self._execute() 8729 1726773035.86919: variable 'ansible_host' from source: host vars for 'managed_node1' 8729 1726773035.86928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8729 1726773035.86936: variable 'omit' from source: magic vars 8729 1726773035.87021: variable 'omit' from source: magic vars 8729 1726773035.87051: variable 'omit' from source: magic vars 8729 1726773035.87319: variable '__kernel_settings_profile_filename' from source: role '' exported vars 8729 1726773035.87378: variable '__kernel_settings_profile_dir' from source: role '' exported vars 8729 1726773035.87447: variable '__kernel_settings_profile_parent' from source: set_fact 8729 1726773035.87459: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 8729 1726773035.87494: variable 'omit' from source: magic vars 8729 1726773035.87532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8729 1726773035.87562: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8729 1726773035.87582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8729 1726773035.87611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8729 1726773035.87624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8729 1726773035.87650: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8729 1726773035.87655: variable 'ansible_host' from source: host vars for 'managed_node1' 8729 1726773035.87658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8729 1726773035.87730: Set connection var ansible_shell_executable to /bin/sh 8729 1726773035.87735: Set connection var ansible_connection to ssh 8729 1726773035.87739: Set connection var ansible_module_compression to ZIP_DEFLATED 8729 1726773035.87744: Set connection var ansible_timeout to 10 8729 1726773035.87746: Set connection var ansible_shell_type to sh 8729 1726773035.87752: Set connection var ansible_pipelining to False 8729 1726773035.87774: variable 'ansible_shell_executable' from source: unknown 8729 1726773035.87780: variable 'ansible_connection' from source: unknown 8729 1726773035.87782: variable 'ansible_module_compression' from source: unknown 8729 1726773035.87784: variable 'ansible_shell_type' from source: unknown 8729 1726773035.87787: variable 'ansible_shell_executable' from source: unknown 8729 1726773035.87790: variable 'ansible_host' from source: host vars for 'managed_node1' 8729 1726773035.87792: variable 'ansible_pipelining' from source: unknown 8729 1726773035.87794: variable 'ansible_timeout' from source: unknown 8729 1726773035.87796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8729 1726773035.87900: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8729 1726773035.87910: variable 'omit' from source: magic vars 8729 1726773035.87913: starting attempt loop 8729 1726773035.87915: running the handler 8729 1726773035.87921: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8729 1726773035.87934: _low_level_execute_command(): starting 8729 1726773035.87940: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8729 1726773035.90434: stdout chunk (state=2): >>>/root <<< 8729 1726773035.90557: stderr chunk (state=3): >>><<< 8729 1726773035.90567: stdout chunk (state=3): >>><<< 8729 1726773035.90592: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8729 1726773035.90607: _low_level_execute_command(): starting 8729 1726773035.90614: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079 `" && echo ansible-tmp-1726773035.9060152-8729-91712692000079="` echo /root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079 `" ) && sleep 0' 8729 1726773035.93207: stdout chunk (state=2): >>>ansible-tmp-1726773035.9060152-8729-91712692000079=/root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079 <<< 8729 1726773035.93350: stderr chunk (state=3): >>><<< 8729 1726773035.93358: stdout chunk (state=3): >>><<< 8729 1726773035.93379: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773035.9060152-8729-91712692000079=/root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079 , stderr= 8729 1726773035.93419: variable 'ansible_module_compression' from source: unknown 8729 1726773035.93464: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 8729 1726773035.93503: variable 'ansible_facts' from source: unknown 8729 1726773035.93597: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079/AnsiballZ_command.py 8729 1726773035.94004: Sending initial data 8729 1726773035.94011: Sent initial data (153 bytes) 8729 1726773035.96434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpdbbgs39v /root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079/AnsiballZ_command.py <<< 8729 1726773035.97864: stderr chunk (state=3): >>><<< 8729 1726773035.97877: stdout chunk (state=3): >>><<< 8729 1726773035.97899: done transferring module to remote 8729 1726773035.97910: _low_level_execute_command(): starting 8729 1726773035.97917: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079/ /root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079/AnsiballZ_command.py && sleep 0' 8729 1726773036.00368: stderr chunk (state=2): >>><<< 8729 1726773036.00379: stdout chunk (state=2): >>><<< 8729 1726773036.00397: _low_level_execute_command() done: rc=0, stdout=, stderr= 8729 1726773036.00402: _low_level_execute_command(): starting 8729 1726773036.00408: _low_level_execute_command(): executing: 
/bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079/AnsiballZ_command.py && sleep 0' 8729 1726773036.16031: stdout chunk (state=2): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 15:10:36.150979", "end": "2024-09-19 15:10:36.158336", "delta": "0:00:00.007357", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 8729 1726773036.17186: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8729 1726773036.17200: stdout chunk (state=3): >>><<< 8729 1726773036.17213: stderr chunk (state=3): >>><<< 8729 1726773036.17227: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ conf=/etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysctl\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[sysfs\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[systemd\\]' /etc/tuned/kernel_settings/tuned.conf\n+ for section in sysctl sysfs systemd vm\n+ grep '^\\[vm\\]' /etc/tuned/kernel_settings/tuned.conf\n+ exit 0", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "start": "2024-09-19 15:10:36.150979", "end": "2024-09-19 15:10:36.158336", "delta": "0:00:00.007357", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
8729 1726773036.17255: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\["$section"\\\\] "$conf"; then\n echo ERROR: "$section" settings present\n rc=1\n fi\ndone\nexit "$rc"\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8729 1726773036.17265: _low_level_execute_command(): starting 8729 1726773036.17271: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773035.9060152-8729-91712692000079/ > /dev/null 2>&1 && sleep 0' 8729 1726773036.19913: stderr chunk (state=2): >>><<< 8729 1726773036.19923: stdout chunk (state=2): >>><<< 8729 1726773036.19939: _low_level_execute_command() done: rc=0, stdout=, stderr= 8729 1726773036.19949: handler run complete 8729 1726773036.19971: Evaluated conditional (False): False 8729 1726773036.19982: attempt loop complete, returning result 8729 1726773036.19988: _execute() done 8729 1726773036.19992: dumping result to json 8729 1726773036.19998: done dumping result, returning 8729 1726773036.20004: done running TaskExecutor() for managed_node1/TASK: Verify no settings [0affffe7-6841-f581-0619-000000000098] 8729 1726773036.20011: sending task result for task 0affffe7-6841-f581-0619-000000000098 8729 1726773036.20041: done sending task result for task 0affffe7-6841-f581-0619-000000000098 8729 1726773036.20045: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nconf=/etc/tuned/kernel_settings/tuned.conf\nfor section in sysctl sysfs systemd vm; do\n if grep ^\\\\[\"$section\"\\\\] \"$conf\"; then\n echo ERROR: \"$section\" settings present\n rc=1\n fi\ndone\nexit \"$rc\"\n", "delta": "0:00:00.007357", "end": "2024-09-19 15:10:36.158336", "rc": 0, "start": "2024-09-19 15:10:36.150979" } STDERR: + exec + rc=0 + conf=/etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysctl\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[sysfs\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[systemd\]' /etc/tuned/kernel_settings/tuned.conf + for section in sysctl sysfs systemd vm + grep '^\[vm\]' /etc/tuned/kernel_settings/tuned.conf + exit 0 8208 1726773036.20255: no more pending results, returning what we have 8208 1726773036.20258: results queue empty 8208 1726773036.20258: checking for any_errors_fatal 8208 1726773036.20260: done checking for any_errors_fatal 8208 1726773036.20261: checking for max_fail_percentage 8208 1726773036.20262: done checking for max_fail_percentage 8208 1726773036.20263: checking to see if all hosts have failed and the running result is not ok 8208 1726773036.20263: done checking to see if all hosts have failed 8208 1726773036.20264: getting the remaining hosts for this loop 8208 
1726773036.20265: done getting the remaining hosts for this loop 8208 1726773036.20269: getting the next task for host managed_node1 8208 1726773036.20275: done getting next task for host managed_node1 8208 1726773036.20276: ^ task is: TASK: Remove kernel_settings tuned profile 8208 1726773036.20278: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773036.20281: getting variables 8208 1726773036.20282: in VariableManager get_vars() 8208 1726773036.20319: Calling all_inventory to load vars for managed_node1 8208 1726773036.20321: Calling groups_inventory to load vars for managed_node1 8208 1726773036.20322: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773036.20330: Calling all_plugins_play to load vars for managed_node1 8208 1726773036.20335: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773036.20338: Calling groups_plugins_play to load vars for managed_node1 8208 1726773036.20451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773036.20568: done with get_vars() 8208 1726773036.20577: done getting variables TASK [Remove kernel_settings tuned profile] ************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.342) 0:00:16.880 **** 8208 1726773036.20646: entering _queue_task() for managed_node1/file 8208 1726773036.20849: worker is 1 (out of 1 available) 8208 1726773036.20863: exiting _queue_task() for managed_node1/file 8208 1726773036.20873: done queuing things up, now waiting for results queue to drain 8208 1726773036.20875: waiting for pending results... 
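
The file task queued here deletes the profile directory the role created. In the test the path is assembled from role variables (__kernel_settings_profile_dir, __kernel_settings_profile_parent and __kernel_settings_tuned_profile are resolved in the next entry), but the expanded module invocation shown further below boils down to the equivalent of:

# Equivalent of the expanded file-module call below; path and state taken from the log,
# whereas the actual test builds the path from role variables rather than hard coding it.
- name: Remove kernel_settings tuned profile
  file:
    path: /etc/tuned/kernel_settings
    state: absent
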
8750 1726773036.21086: running TaskExecutor() for managed_node1/TASK: Remove kernel_settings tuned profile 8750 1726773036.21209: in run() - task 0affffe7-6841-f581-0619-000000000099 8750 1726773036.21233: variable 'ansible_search_path' from source: unknown 8750 1726773036.21237: variable 'ansible_search_path' from source: unknown 8750 1726773036.21270: calling self._execute() 8750 1726773036.21364: variable 'ansible_host' from source: host vars for 'managed_node1' 8750 1726773036.21373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8750 1726773036.21381: variable 'omit' from source: magic vars 8750 1726773036.21499: variable 'omit' from source: magic vars 8750 1726773036.21544: variable 'omit' from source: magic vars 8750 1726773036.21577: variable '__kernel_settings_profile_dir' from source: role '' exported vars 8750 1726773036.21861: variable '__kernel_settings_profile_dir' from source: role '' exported vars 8750 1726773036.21937: variable '__kernel_settings_profile_parent' from source: set_fact 8750 1726773036.21944: variable '__kernel_settings_tuned_profile' from source: role '' exported vars 8750 1726773036.21971: variable 'omit' from source: magic vars 8750 1726773036.22198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8750 1726773036.22308: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8750 1726773036.22331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8750 1726773036.22349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8750 1726773036.22361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8750 1726773036.22391: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8750 1726773036.22397: variable 'ansible_host' from source: host vars for 'managed_node1' 8750 1726773036.22401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8750 1726773036.22496: Set connection var ansible_shell_executable to /bin/sh 8750 1726773036.22502: Set connection var ansible_connection to ssh 8750 1726773036.22509: Set connection var ansible_module_compression to ZIP_DEFLATED 8750 1726773036.22516: Set connection var ansible_timeout to 10 8750 1726773036.22520: Set connection var ansible_shell_type to sh 8750 1726773036.22527: Set connection var ansible_pipelining to False 8750 1726773036.22550: variable 'ansible_shell_executable' from source: unknown 8750 1726773036.22555: variable 'ansible_connection' from source: unknown 8750 1726773036.22558: variable 'ansible_module_compression' from source: unknown 8750 1726773036.22561: variable 'ansible_shell_type' from source: unknown 8750 1726773036.22564: variable 'ansible_shell_executable' from source: unknown 8750 1726773036.22566: variable 'ansible_host' from source: host vars for 'managed_node1' 8750 1726773036.22570: variable 'ansible_pipelining' from source: unknown 8750 1726773036.22573: variable 'ansible_timeout' from source: unknown 8750 1726773036.22577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8750 1726773036.22759: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8750 1726773036.22771: variable 'omit' from source: magic vars 8750 1726773036.22777: starting attempt loop 8750 1726773036.22780: running the handler 8750 1726773036.22795: _low_level_execute_command(): starting 8750 1726773036.22803: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8750 1726773036.25476: stdout chunk (state=2): >>>/root <<< 8750 1726773036.25624: stderr chunk (state=3): >>><<< 8750 1726773036.25635: stdout chunk (state=3): >>><<< 8750 1726773036.25659: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8750 1726773036.25677: _low_level_execute_command(): starting 8750 1726773036.25687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609 `" && echo ansible-tmp-1726773036.2567153-8750-63648527811609="` echo /root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609 `" ) && sleep 0' 8750 1726773036.28447: stdout chunk (state=2): >>>ansible-tmp-1726773036.2567153-8750-63648527811609=/root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609 <<< 8750 1726773036.28804: stderr chunk (state=3): >>><<< 8750 1726773036.28814: stdout chunk (state=3): >>><<< 8750 1726773036.28833: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773036.2567153-8750-63648527811609=/root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609 , stderr= 8750 1726773036.28882: variable 'ansible_module_compression' from source: unknown 8750 1726773036.28939: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.file-ZIP_DEFLATED 8750 1726773036.28978: variable 'ansible_facts' from source: unknown 8750 1726773036.29084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609/AnsiballZ_file.py 8750 1726773036.30311: Sending initial data 8750 1726773036.30322: Sent initial data (150 bytes) 8750 1726773036.32917: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp6hdt_jcz /root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609/AnsiballZ_file.py <<< 8750 1726773036.34703: stderr chunk (state=3): >>><<< 8750 1726773036.34716: stdout chunk (state=3): >>><<< 8750 1726773036.34743: done transferring module to remote 8750 1726773036.34757: _low_level_execute_command(): starting 8750 1726773036.34764: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609/ /root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609/AnsiballZ_file.py && sleep 0' 8750 1726773036.37470: stderr chunk (state=2): >>><<< 8750 1726773036.37481: stdout chunk (state=2): >>><<< 8750 1726773036.37498: _low_level_execute_command() done: rc=0, stdout=, stderr= 8750 1726773036.37503: _low_level_execute_command(): starting 8750 1726773036.37509: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609/AnsiballZ_file.py && sleep 0' 8750 1726773036.53154: stdout chunk (state=2): >>> {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": 
["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8750 1726773036.54054: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8750 1726773036.54104: stderr chunk (state=3): >>><<< 8750 1726773036.54111: stdout chunk (state=3): >>><<< 8750 1726773036.54129: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/tuned/kernel_settings", "changed": true, "diff": {"before": {"path": "/etc/tuned/kernel_settings", "state": "directory", "path_content": {"directories": [], "files": ["/etc/tuned/kernel_settings/tuned.conf"]}}, "after": {"path": "/etc/tuned/kernel_settings", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"path": "/etc/tuned/kernel_settings", "state": "absent", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.43.7 closed. 8750 1726773036.54159: done with _execute_module (file, {'path': '/etc/tuned/kernel_settings', 'state': 'absent', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8750 1726773036.54170: _low_level_execute_command(): starting 8750 1726773036.54176: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773036.2567153-8750-63648527811609/ > /dev/null 2>&1 && sleep 0' 8750 1726773036.56900: stderr chunk (state=2): >>><<< 8750 1726773036.56910: stdout chunk (state=2): >>><<< 8750 1726773036.56926: _low_level_execute_command() done: rc=0, stdout=, stderr= 8750 1726773036.56932: handler run complete 8750 1726773036.56957: attempt loop complete, returning result 8750 1726773036.56961: _execute() done 8750 1726773036.56964: dumping result to json 8750 1726773036.56971: done dumping result, returning 8750 1726773036.56977: done running TaskExecutor() for managed_node1/TASK: Remove kernel_settings tuned profile [0affffe7-6841-f581-0619-000000000099] 8750 1726773036.56982: sending task result for task 0affffe7-6841-f581-0619-000000000099 8750 1726773036.57021: done sending task result for task 0affffe7-6841-f581-0619-000000000099 8750 1726773036.57024: WORKER PROCESS EXITING changed: [managed_node1] => { 
"changed": true, "path": "/etc/tuned/kernel_settings", "state": "absent" } 8208 1726773036.57435: no more pending results, returning what we have 8208 1726773036.57438: results queue empty 8208 1726773036.57439: checking for any_errors_fatal 8208 1726773036.57445: done checking for any_errors_fatal 8208 1726773036.57445: checking for max_fail_percentage 8208 1726773036.57447: done checking for max_fail_percentage 8208 1726773036.57448: checking to see if all hosts have failed and the running result is not ok 8208 1726773036.57448: done checking to see if all hosts have failed 8208 1726773036.57449: getting the remaining hosts for this loop 8208 1726773036.57450: done getting the remaining hosts for this loop 8208 1726773036.57453: getting the next task for host managed_node1 8208 1726773036.57458: done getting next task for host managed_node1 8208 1726773036.57461: ^ task is: TASK: Get active_profile 8208 1726773036.57463: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773036.57469: getting variables 8208 1726773036.57470: in VariableManager get_vars() 8208 1726773036.57506: Calling all_inventory to load vars for managed_node1 8208 1726773036.57509: Calling groups_inventory to load vars for managed_node1 8208 1726773036.57511: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773036.57519: Calling all_plugins_play to load vars for managed_node1 8208 1726773036.57521: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773036.57523: Calling groups_plugins_play to load vars for managed_node1 8208 1726773036.57744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773036.57940: done with get_vars() 8208 1726773036.57951: done getting variables TASK [Get active_profile] ****************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.373) 0:00:17.254 **** 8208 1726773036.58048: entering _queue_task() for managed_node1/slurp 8208 1726773036.58291: worker is 1 (out of 1 available) 8208 1726773036.58307: exiting _queue_task() for managed_node1/slurp 8208 1726773036.58318: done queuing things up, now waiting for results queue to drain 8208 1726773036.58320: waiting for pending results... 
8775 1726773036.58803: running TaskExecutor() for managed_node1/TASK: Get active_profile 8775 1726773036.58930: in run() - task 0affffe7-6841-f581-0619-00000000009a 8775 1726773036.58948: variable 'ansible_search_path' from source: unknown 8775 1726773036.58953: variable 'ansible_search_path' from source: unknown 8775 1726773036.58991: calling self._execute() 8775 1726773036.59073: variable 'ansible_host' from source: host vars for 'managed_node1' 8775 1726773036.59083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8775 1726773036.59094: variable 'omit' from source: magic vars 8775 1726773036.59203: variable 'omit' from source: magic vars 8775 1726773036.59242: variable 'omit' from source: magic vars 8775 1726773036.59271: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 8775 1726773036.59567: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 8775 1726773036.59649: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 8775 1726773036.59684: variable 'omit' from source: magic vars 8775 1726773036.59780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8775 1726773036.59819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8775 1726773036.59840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8775 1726773036.59860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8775 1726773036.59875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8775 1726773036.59905: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8775 1726773036.59910: variable 'ansible_host' from source: host vars for 'managed_node1' 8775 1726773036.59915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8775 1726773036.60017: Set connection var ansible_shell_executable to /bin/sh 8775 1726773036.60027: Set connection var ansible_connection to ssh 8775 1726773036.60037: Set connection var ansible_module_compression to ZIP_DEFLATED 8775 1726773036.60046: Set connection var ansible_timeout to 10 8775 1726773036.60049: Set connection var ansible_shell_type to sh 8775 1726773036.60057: Set connection var ansible_pipelining to False 8775 1726773036.60081: variable 'ansible_shell_executable' from source: unknown 8775 1726773036.60088: variable 'ansible_connection' from source: unknown 8775 1726773036.60091: variable 'ansible_module_compression' from source: unknown 8775 1726773036.60095: variable 'ansible_shell_type' from source: unknown 8775 1726773036.60098: variable 'ansible_shell_executable' from source: unknown 8775 1726773036.60101: variable 'ansible_host' from source: host vars for 'managed_node1' 8775 1726773036.60105: variable 'ansible_pipelining' from source: unknown 8775 1726773036.60108: variable 'ansible_timeout' from source: unknown 8775 1726773036.60111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8775 1726773036.60299: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 8775 1726773036.60311: 
variable 'omit' from source: magic vars 8775 1726773036.60317: starting attempt loop 8775 1726773036.60320: running the handler 8775 1726773036.60333: _low_level_execute_command(): starting 8775 1726773036.60342: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8775 1726773036.63695: stdout chunk (state=2): >>>/root <<< 8775 1726773036.63880: stderr chunk (state=3): >>><<< 8775 1726773036.63891: stdout chunk (state=3): >>><<< 8775 1726773036.63913: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8775 1726773036.63929: _low_level_execute_command(): starting 8775 1726773036.63937: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363 `" && echo ansible-tmp-1726773036.6392233-8775-43038061476363="` echo /root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363 `" ) && sleep 0' 8775 1726773036.67109: stdout chunk (state=2): >>>ansible-tmp-1726773036.6392233-8775-43038061476363=/root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363 <<< 8775 1726773036.67232: stderr chunk (state=3): >>><<< 8775 1726773036.67242: stdout chunk (state=3): >>><<< 8775 1726773036.67261: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773036.6392233-8775-43038061476363=/root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363 , stderr= 8775 1726773036.67314: variable 'ansible_module_compression' from source: unknown 8775 1726773036.67357: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.slurp-ZIP_DEFLATED 8775 1726773036.67397: variable 'ansible_facts' from source: unknown 8775 1726773036.67498: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363/AnsiballZ_slurp.py 8775 1726773036.68827: Sending initial data 8775 1726773036.68836: Sent initial data (151 bytes) 8775 1726773036.71754: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpur_s5x_a /root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363/AnsiballZ_slurp.py <<< 8775 1726773036.73272: stderr chunk (state=3): >>><<< 8775 1726773036.73282: stdout chunk (state=3): >>><<< 8775 1726773036.73309: done transferring module to remote 8775 1726773036.73322: _low_level_execute_command(): starting 8775 1726773036.73327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363/ /root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363/AnsiballZ_slurp.py && sleep 0' 8775 1726773036.75940: stderr chunk (state=2): >>><<< 8775 1726773036.75952: stdout chunk (state=2): >>><<< 8775 1726773036.75971: _low_level_execute_command() done: rc=0, stdout=, stderr= 8775 1726773036.75976: _low_level_execute_command(): starting 8775 1726773036.75981: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363/AnsiballZ_slurp.py && sleep 0' 8775 1726773036.90903: stdout chunk (state=2): >>> {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} <<< 8775 1726773036.91882: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. 
<<< 8775 1726773036.91932: stderr chunk (state=3): >>><<< 8775 1726773036.91940: stdout chunk (state=3): >>><<< 8775 1726773036.91956: _low_level_execute_command() done: rc=0, stdout= {"content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "source": "/etc/tuned/active_profile", "encoding": "base64", "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "src": "/etc/tuned/active_profile"}}} , stderr=Shared connection to 10.31.43.7 closed. 8775 1726773036.91977: done with _execute_module (slurp, {'path': '/etc/tuned/active_profile', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'slurp', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8775 1726773036.91990: _low_level_execute_command(): starting 8775 1726773036.91997: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773036.6392233-8775-43038061476363/ > /dev/null 2>&1 && sleep 0' 8775 1726773036.94561: stderr chunk (state=2): >>><<< 8775 1726773036.94573: stdout chunk (state=2): >>><<< 8775 1726773036.94590: _low_level_execute_command() done: rc=0, stdout=, stderr= 8775 1726773036.94599: handler run complete 8775 1726773036.94612: attempt loop complete, returning result 8775 1726773036.94615: _execute() done 8775 1726773036.94619: dumping result to json 8775 1726773036.94624: done dumping result, returning 8775 1726773036.94631: done running TaskExecutor() for managed_node1/TASK: Get active_profile [0affffe7-6841-f581-0619-00000000009a] 8775 1726773036.94638: sending task result for task 0affffe7-6841-f581-0619-00000000009a 8775 1726773036.94668: done sending task result for task 0affffe7-6841-f581-0619-00000000009a 8775 1726773036.94672: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "content": "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK", "encoding": "base64", "source": "/etc/tuned/active_profile" } 8208 1726773036.94804: no more pending results, returning what we have 8208 1726773036.94807: results queue empty 8208 1726773036.94808: checking for any_errors_fatal 8208 1726773036.94816: done checking for any_errors_fatal 8208 1726773036.94816: checking for max_fail_percentage 8208 1726773036.94818: done checking for max_fail_percentage 8208 1726773036.94818: checking to see if all hosts have failed and the running result is not ok 8208 1726773036.94819: done checking to see if all hosts have failed 8208 1726773036.94819: getting the remaining hosts for this loop 8208 1726773036.94821: done getting the remaining hosts for this loop 8208 1726773036.94824: getting the next task for host managed_node1 8208 1726773036.94829: done getting next task for host managed_node1 8208 1726773036.94831: ^ task is: TASK: Ensure kernel_settings is not in active_profile 8208 1726773036.94833: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773036.94836: getting variables 8208 1726773036.94837: in VariableManager get_vars() 8208 1726773036.94873: Calling all_inventory to load vars for managed_node1 8208 1726773036.94876: Calling groups_inventory to load vars for managed_node1 8208 1726773036.94878: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773036.94889: Calling all_plugins_play to load vars for managed_node1 8208 1726773036.94892: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773036.94894: Calling groups_plugins_play to load vars for managed_node1 8208 1726773036.95021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773036.95140: done with get_vars() 8208 1726773036.95150: done getting variables 8208 1726773036.95199: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure kernel_settings is not in active_profile] ************************* task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46 Thursday 19 September 2024 15:10:36 -0400 (0:00:00.371) 0:00:17.626 **** 8208 1726773036.95221: entering _queue_task() for managed_node1/copy 8208 1726773036.95400: worker is 1 (out of 1 available) 8208 1726773036.95415: exiting _queue_task() for managed_node1/copy 8208 1726773036.95427: done queuing things up, now waiting for results queue to drain 8208 1726773036.95429: waiting for pending results... 
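The slurp result above returns the target file base64-encoded: "dmlydHVhbC1ndWVzdCBrZXJuZWxfc2V0dGluZ3MK" decodes to "virtual-guest kernel_settings" plus a trailing newline, which matches the 30-byte /etc/tuned/active_profile that the later stat call reports. As a reading aid only, here is a minimal sketch of a task pair that reads and decodes the profile list this way; the register name __active_profile is taken from the task vars shown in the log, but the actual tasks in cleanup.yml are not reproduced here, so the exact form is an assumption:

  - name: Get active_profile
    slurp:
      path: /etc/tuned/active_profile
    register: __active_profile   # register name appears in the log; usage below is illustrative

  - name: Show the decoded profile list
    debug:
      msg: "{{ __active_profile.content | b64decode | trim }}"   # -> "virtual-guest kernel_settings"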
8800 1726773036.95550: running TaskExecutor() for managed_node1/TASK: Ensure kernel_settings is not in active_profile 8800 1726773036.95654: in run() - task 0affffe7-6841-f581-0619-00000000009b 8800 1726773036.95670: variable 'ansible_search_path' from source: unknown 8800 1726773036.95674: variable 'ansible_search_path' from source: unknown 8800 1726773036.95712: calling self._execute() 8800 1726773036.95789: variable 'ansible_host' from source: host vars for 'managed_node1' 8800 1726773036.95797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8800 1726773036.95805: variable 'omit' from source: magic vars 8800 1726773036.95901: variable 'omit' from source: magic vars 8800 1726773036.95932: variable 'omit' from source: magic vars 8800 1726773036.95960: variable '__active_profile' from source: task vars 8800 1726773036.96247: variable '__active_profile' from source: task vars 8800 1726773036.96481: variable '__cur_profile' from source: task vars 8800 1726773036.96635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 8800 1726773036.98860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 8800 1726773036.98946: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 8800 1726773036.98989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 8800 1726773036.99024: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 8800 1726773036.99049: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 8800 1726773036.99126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 8800 1726773036.99169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 8800 1726773036.99196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 8800 1726773036.99236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 8800 1726773036.99251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 8800 1726773036.99363: variable '__kernel_settings_tuned_current_profile' from source: set_fact 8800 1726773036.99416: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 8800 1726773036.99472: variable '__kernel_settings_tuned_active_profile' from source: role '' exported vars 8800 1726773036.99525: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 8800 1726773036.99548: variable 'omit' from source: magic vars 8800 1726773036.99573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8800 1726773036.99596: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8800 1726773036.99611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8800 1726773036.99625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8800 1726773036.99635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8800 1726773036.99660: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8800 1726773036.99668: variable 'ansible_host' from source: host vars for 'managed_node1' 8800 1726773036.99673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8800 1726773036.99741: Set connection var ansible_shell_executable to /bin/sh 8800 1726773036.99747: Set connection var ansible_connection to ssh 8800 1726773036.99753: Set connection var ansible_module_compression to ZIP_DEFLATED 8800 1726773036.99761: Set connection var ansible_timeout to 10 8800 1726773036.99764: Set connection var ansible_shell_type to sh 8800 1726773036.99771: Set connection var ansible_pipelining to False 8800 1726773036.99790: variable 'ansible_shell_executable' from source: unknown 8800 1726773036.99794: variable 'ansible_connection' from source: unknown 8800 1726773036.99798: variable 'ansible_module_compression' from source: unknown 8800 1726773036.99801: variable 'ansible_shell_type' from source: unknown 8800 1726773036.99804: variable 'ansible_shell_executable' from source: unknown 8800 1726773036.99808: variable 'ansible_host' from source: host vars for 'managed_node1' 8800 1726773036.99812: variable 'ansible_pipelining' from source: unknown 8800 1726773036.99815: variable 'ansible_timeout' from source: unknown 8800 1726773036.99819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8800 1726773036.99886: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 8800 1726773036.99897: variable 'omit' from source: magic vars 8800 1726773036.99903: starting attempt loop 8800 1726773036.99906: running the handler 8800 1726773036.99916: _low_level_execute_command(): starting 8800 1726773036.99924: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8800 1726773037.02485: stdout chunk (state=2): >>>/root <<< 8800 1726773037.02748: stderr chunk (state=3): >>><<< 8800 1726773037.02757: stdout chunk (state=3): >>><<< 8800 1726773037.02781: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8800 1726773037.02796: _low_level_execute_command(): starting 8800 1726773037.02803: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288 `" && echo ansible-tmp-1726773037.0279095-8800-43085015507288="` echo /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288 `" ) && sleep 0' 8800 1726773037.05357: stdout chunk (state=2): >>>ansible-tmp-1726773037.0279095-8800-43085015507288=/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288 <<< 8800 1726773037.05481: stderr chunk (state=3): >>><<< 8800 1726773037.05490: stdout 
chunk (state=3): >>><<< 8800 1726773037.05505: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773037.0279095-8800-43085015507288=/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288 , stderr= 8800 1726773037.05577: variable 'ansible_module_compression' from source: unknown 8800 1726773037.05621: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8800 1726773037.05648: variable 'ansible_facts' from source: unknown 8800 1726773037.05718: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/AnsiballZ_stat.py 8800 1726773037.05807: Sending initial data 8800 1726773037.05814: Sent initial data (150 bytes) 8800 1726773037.08444: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp39e1inuh /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/AnsiballZ_stat.py <<< 8800 1726773037.09868: stderr chunk (state=3): >>><<< 8800 1726773037.09883: stdout chunk (state=3): >>><<< 8800 1726773037.09909: done transferring module to remote 8800 1726773037.09920: _low_level_execute_command(): starting 8800 1726773037.09925: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/ /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/AnsiballZ_stat.py && sleep 0' 8800 1726773037.12479: stderr chunk (state=2): >>><<< 8800 1726773037.12489: stdout chunk (state=2): >>><<< 8800 1726773037.12504: _low_level_execute_command() done: rc=0, stdout=, stderr= 8800 1726773037.12508: _low_level_execute_command(): starting 8800 1726773037.12513: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/AnsiballZ_stat.py && sleep 0' 8800 1726773037.28293: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 494928066, "dev": 51713, "nlink": 1, "atime": 1726773036.9074695, "mtime": 1726773035.0464566, "ctime": 1726773035.0464566, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3672758599", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8800 1726773037.29648: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. 
<<< 8800 1726773037.29658: stdout chunk (state=3): >>><<< 8800 1726773037.29670: stderr chunk (state=3): >>><<< 8800 1726773037.29693: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/active_profile", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 30, "inode": 494928066, "dev": 51713, "nlink": 1, "atime": 1726773036.9074695, "mtime": 1726773035.0464566, "ctime": 1726773035.0464566, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "a79569d3860cb6a066e0e92c8b22ffd0e8796bfd", "mimetype": "text/plain", "charset": "us-ascii", "version": "3672758599", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/active_profile", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.43.7 closed. 8800 1726773037.29731: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/active_profile', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8800 1726773037.29822: Sending initial data 8800 1726773037.29830: Sent initial data (139 bytes) 8800 1726773037.34600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpglk7_1bu /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/source <<< 8800 1726773037.35628: stderr chunk (state=3): >>><<< 8800 1726773037.35639: stdout chunk (state=3): >>><<< 8800 1726773037.35664: _low_level_execute_command(): starting 8800 1726773037.35672: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/ /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/source && sleep 0' 8800 1726773037.38963: stderr chunk (state=2): >>><<< 8800 1726773037.38973: stdout chunk (state=2): >>><<< 8800 1726773037.38993: _low_level_execute_command() done: rc=0, stdout=, stderr= 8800 1726773037.39018: variable 'ansible_module_compression' from source: unknown 8800 1726773037.39066: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8800 1726773037.39088: variable 'ansible_facts' from source: unknown 8800 1726773037.39178: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/AnsiballZ_copy.py 8800 1726773037.39656: Sending initial data 8800 1726773037.39663: Sent initial data (150 bytes) 8800 1726773037.43088: stdout chunk (state=3): >>>sftp> 
put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmp6dxswyji /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/AnsiballZ_copy.py <<< 8800 1726773037.44536: stderr chunk (state=3): >>><<< 8800 1726773037.44545: stdout chunk (state=3): >>><<< 8800 1726773037.44566: done transferring module to remote 8800 1726773037.44577: _low_level_execute_command(): starting 8800 1726773037.44582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/ /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/AnsiballZ_copy.py && sleep 0' 8800 1726773037.47071: stderr chunk (state=2): >>><<< 8800 1726773037.47081: stdout chunk (state=2): >>><<< 8800 1726773037.47098: _low_level_execute_command() done: rc=0, stdout=, stderr= 8800 1726773037.47102: _low_level_execute_command(): starting 8800 1726773037.47109: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/AnsiballZ_copy.py && sleep 0' 8800 1726773037.63137: stdout chunk (state=2): >>> {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/source", "_original_basename": "tmpglk7_1bu", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8800 1726773037.64263: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8800 1726773037.64314: stderr chunk (state=3): >>><<< 8800 1726773037.64323: stdout chunk (state=3): >>><<< 8800 1726773037.64341: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/active_profile", "src": "/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/source", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "invocation": {"module_args": {"dest": "/etc/tuned/active_profile", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/source", "_original_basename": "tmpglk7_1bu", "follow": false, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
8800 1726773037.64371: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/active_profile', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/source', '_original_basename': 'tmpglk7_1bu', 'follow': False, 'checksum': '633f07e1b5698d04352d5dca735869bf2fe77897', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8800 1726773037.64382: _low_level_execute_command(): starting 8800 1726773037.64389: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/ > /dev/null 2>&1 && sleep 0' 8800 1726773037.66891: stderr chunk (state=2): >>><<< 8800 1726773037.66903: stdout chunk (state=2): >>><<< 8800 1726773037.66917: _low_level_execute_command() done: rc=0, stdout=, stderr= 8800 1726773037.66925: handler run complete 8800 1726773037.66944: attempt loop complete, returning result 8800 1726773037.66947: _execute() done 8800 1726773037.66951: dumping result to json 8800 1726773037.66956: done dumping result, returning 8800 1726773037.66963: done running TaskExecutor() for managed_node1/TASK: Ensure kernel_settings is not in active_profile [0affffe7-6841-f581-0619-00000000009b] 8800 1726773037.66971: sending task result for task 0affffe7-6841-f581-0619-00000000009b 8800 1726773037.67003: done sending task result for task 0affffe7-6841-f581-0619-00000000009b 8800 1726773037.67006: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "checksum": "633f07e1b5698d04352d5dca735869bf2fe77897", "dest": "/etc/tuned/active_profile", "gid": 0, "group": "root", "md5sum": "9a561d913bcdb5a659ec2dd035975a8e", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_rw_etc_t:s0", "size": 14, "src": "/root/.ansible/tmp/ansible-tmp-1726773037.0279095-8800-43085015507288/source", "state": "file", "uid": 0 } 8208 1726773037.67166: no more pending results, returning what we have 8208 1726773037.67170: results queue empty 8208 1726773037.67170: checking for any_errors_fatal 8208 1726773037.67176: done checking for any_errors_fatal 8208 1726773037.67177: checking for max_fail_percentage 8208 1726773037.67178: done checking for max_fail_percentage 8208 1726773037.67178: checking to see if all hosts have failed and the running result is not ok 8208 1726773037.67179: done checking to see if all hosts have failed 8208 1726773037.67180: getting the remaining hosts for this loop 8208 1726773037.67181: done getting the remaining hosts for this loop 8208 1726773037.67184: getting the next task for host managed_node1 8208 1726773037.67191: done getting next task for host managed_node1 8208 1726773037.67193: ^ task is: TASK: Set profile_mode to auto 8208 1726773037.67195: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773037.67200: getting variables 8208 1726773037.67202: in VariableManager get_vars() 8208 1726773037.67235: Calling all_inventory to load vars for managed_node1 8208 1726773037.67237: Calling groups_inventory to load vars for managed_node1 8208 1726773037.67239: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773037.67248: Calling all_plugins_play to load vars for managed_node1 8208 1726773037.67251: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773037.67253: Calling groups_plugins_play to load vars for managed_node1 8208 1726773037.67412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773037.67525: done with get_vars() 8208 1726773037.67534: done getting variables 8208 1726773037.67577: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set profile_mode to auto] ************************************************ task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57 Thursday 19 September 2024 15:10:37 -0400 (0:00:00.723) 0:00:18.350 **** 8208 1726773037.67602: entering _queue_task() for managed_node1/copy 8208 1726773037.67772: worker is 1 (out of 1 available) 8208 1726773037.67789: exiting _queue_task() for managed_node1/copy 8208 1726773037.67802: done queuing things up, now waiting for results queue to drain 8208 1726773037.67804: waiting for pending results... 
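The copy above rewrites /etc/tuned/active_profile via a staged source file; the result is 14 bytes with mode 0600, consistent with "virtual-guest" plus a newline, i.e. the kernel_settings entry removed from the profile list read earlier. A hedged sketch of what such a task could look like follows; the variables __active_profile and __cur_profile appear in the log as task vars, but how they are populated and the exact filter expression used in cleanup.yml:46 are assumptions:

  - name: Ensure kernel_settings is not in active_profile
    copy:
      dest: /etc/tuned/active_profile
      mode: "0600"
      # Illustrative only: drop the kernel_settings token, renormalize, keep the trailing newline
      # (yields "virtual-guest\n", 14 bytes, matching the copy result above).
      content: "{{ __active_profile.content | b64decode | replace('kernel_settings', '') | trim }}\n"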
8848 1726773037.67927: running TaskExecutor() for managed_node1/TASK: Set profile_mode to auto 8848 1726773037.68031: in run() - task 0affffe7-6841-f581-0619-00000000009c 8848 1726773037.68050: variable 'ansible_search_path' from source: unknown 8848 1726773037.68054: variable 'ansible_search_path' from source: unknown 8848 1726773037.68087: calling self._execute() 8848 1726773037.68152: variable 'ansible_host' from source: host vars for 'managed_node1' 8848 1726773037.68161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8848 1726773037.68172: variable 'omit' from source: magic vars 8848 1726773037.68252: variable 'omit' from source: magic vars 8848 1726773037.68289: variable 'omit' from source: magic vars 8848 1726773037.68310: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 8848 1726773037.68537: variable '__kernel_settings_tuned_profile_mode' from source: role '' exported vars 8848 1726773037.68605: variable '__kernel_settings_tuned_dir' from source: role '' exported vars 8848 1726773037.68631: variable 'omit' from source: magic vars 8848 1726773037.68663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8848 1726773037.68697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8848 1726773037.68716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8848 1726773037.68730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8848 1726773037.68743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8848 1726773037.68770: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8848 1726773037.68775: variable 'ansible_host' from source: host vars for 'managed_node1' 8848 1726773037.68779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8848 1726773037.68847: Set connection var ansible_shell_executable to /bin/sh 8848 1726773037.68853: Set connection var ansible_connection to ssh 8848 1726773037.68860: Set connection var ansible_module_compression to ZIP_DEFLATED 8848 1726773037.68870: Set connection var ansible_timeout to 10 8848 1726773037.68874: Set connection var ansible_shell_type to sh 8848 1726773037.68880: Set connection var ansible_pipelining to False 8848 1726773037.68899: variable 'ansible_shell_executable' from source: unknown 8848 1726773037.68904: variable 'ansible_connection' from source: unknown 8848 1726773037.68907: variable 'ansible_module_compression' from source: unknown 8848 1726773037.68910: variable 'ansible_shell_type' from source: unknown 8848 1726773037.68913: variable 'ansible_shell_executable' from source: unknown 8848 1726773037.68916: variable 'ansible_host' from source: host vars for 'managed_node1' 8848 1726773037.68920: variable 'ansible_pipelining' from source: unknown 8848 1726773037.68922: variable 'ansible_timeout' from source: unknown 8848 1726773037.68924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8848 1726773037.69027: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 8848 1726773037.69038: variable 'omit' from source: magic vars 8848 1726773037.69042: starting attempt loop 8848 1726773037.69044: running the handler 8848 1726773037.69054: _low_level_execute_command(): starting 8848 1726773037.69060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8848 1726773037.71527: stdout chunk (state=2): >>>/root <<< 8848 1726773037.72101: stderr chunk (state=3): >>><<< 8848 1726773037.72112: stdout chunk (state=3): >>><<< 8848 1726773037.72139: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8848 1726773037.72155: _low_level_execute_command(): starting 8848 1726773037.72162: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579 `" && echo ansible-tmp-1726773037.721496-8848-51922528233579="` echo /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579 `" ) && sleep 0' 8848 1726773037.75193: stdout chunk (state=2): >>>ansible-tmp-1726773037.721496-8848-51922528233579=/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579 <<< 8848 1726773037.75210: stderr chunk (state=2): >>><<< 8848 1726773037.75221: stdout chunk (state=3): >>><<< 8848 1726773037.75236: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773037.721496-8848-51922528233579=/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579 , stderr= 8848 1726773037.75335: variable 'ansible_module_compression' from source: unknown 8848 1726773037.75400: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 8848 1726773037.75439: variable 'ansible_facts' from source: unknown 8848 1726773037.75545: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/AnsiballZ_stat.py 8848 1726773037.76060: Sending initial data 8848 1726773037.76068: Sent initial data (149 bytes) 8848 1726773037.79354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpi8uxoc0f /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/AnsiballZ_stat.py <<< 8848 1726773037.80950: stderr chunk (state=3): >>><<< 8848 1726773037.80960: stdout chunk (state=3): >>><<< 8848 1726773037.80986: done transferring module to remote 8848 1726773037.80998: _low_level_execute_command(): starting 8848 1726773037.81004: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/ /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/AnsiballZ_stat.py && sleep 0' 8848 1726773037.83491: stderr chunk (state=2): >>><<< 8848 1726773037.83501: stdout chunk (state=2): >>><<< 8848 1726773037.83516: _low_level_execute_command() done: rc=0, stdout=, stderr= 8848 1726773037.83521: _low_level_execute_command(): starting 8848 1726773037.83527: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/AnsiballZ_stat.py && sleep 0' 8848 1726773037.99653: stdout chunk (state=2): >>> {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 501219463, "dev": 51713, "nlink": 1, "atime": 1726773035.0224564, "mtime": 
1726773035.0474565, "ctime": 1726773035.0474565, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "4047422632", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} <<< 8848 1726773038.01036: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8848 1726773038.01050: stdout chunk (state=3): >>><<< 8848 1726773038.01059: stderr chunk (state=3): >>><<< 8848 1726773038.01071: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/etc/tuned/profile_mode", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 0, "gid": 0, "size": 7, "inode": 501219463, "dev": 51713, "nlink": 1, "atime": 1726773035.0224564, "mtime": 1726773035.0474565, "ctime": 1726773035.0474565, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 8, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "root", "gr_name": "root", "checksum": "3ef9f23deed2e23d3ef2b88b842fb882313e15ce", "mimetype": "text/plain", "charset": "us-ascii", "version": "4047422632", "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"path": "/etc/tuned/profile_mode", "follow": false, "get_checksum": true, "checksum_algorithm": "sha1", "get_mime": true, "get_attributes": true}}} , stderr=Shared connection to 10.31.43.7 closed. 
8848 1726773038.01114: done with _execute_module (ansible.legacy.stat, {'path': '/etc/tuned/profile_mode', 'follow': False, 'get_checksum': True, 'checksum_algorithm': 'sha1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8848 1726773038.01215: Sending initial data 8848 1726773038.01224: Sent initial data (138 bytes) 8848 1726773038.03929: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpdrqjfmrr /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/source <<< 8848 1726773038.04592: stderr chunk (state=3): >>><<< 8848 1726773038.04602: stdout chunk (state=3): >>><<< 8848 1726773038.04626: _low_level_execute_command(): starting 8848 1726773038.04633: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/ /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/source && sleep 0' 8848 1726773038.07096: stderr chunk (state=2): >>><<< 8848 1726773038.07109: stdout chunk (state=2): >>><<< 8848 1726773038.07125: _low_level_execute_command() done: rc=0, stdout=, stderr= 8848 1726773038.07148: variable 'ansible_module_compression' from source: unknown 8848 1726773038.07188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.copy-ZIP_DEFLATED 8848 1726773038.07208: variable 'ansible_facts' from source: unknown 8848 1726773038.07267: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/AnsiballZ_copy.py 8848 1726773038.07361: Sending initial data 8848 1726773038.07368: Sent initial data (149 bytes) 8848 1726773038.10029: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpspedfq4u /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/AnsiballZ_copy.py <<< 8848 1726773038.11491: stderr chunk (state=3): >>><<< 8848 1726773038.11504: stdout chunk (state=3): >>><<< 8848 1726773038.11524: done transferring module to remote 8848 1726773038.11533: _low_level_execute_command(): starting 8848 1726773038.11539: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/ /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/AnsiballZ_copy.py && sleep 0' 8848 1726773038.13993: stderr chunk (state=2): >>><<< 8848 1726773038.14004: stdout chunk (state=2): >>><<< 8848 1726773038.14020: _low_level_execute_command() done: rc=0, stdout=, stderr= 8848 1726773038.14024: _low_level_execute_command(): starting 8848 1726773038.14030: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/AnsiballZ_copy.py && sleep 0' 8848 1726773038.30237: stdout chunk (state=2): >>> {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/source", "md5sum": 
"451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/source", "_original_basename": "tmpdrqjfmrr", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 8848 1726773038.32059: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8848 1726773038.32072: stdout chunk (state=3): >>><<< 8848 1726773038.32083: stderr chunk (state=3): >>><<< 8848 1726773038.32098: _low_level_execute_command() done: rc=0, stdout= {"dest": "/etc/tuned/profile_mode", "src": "/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/source", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "changed": true, "uid": 0, "gid": 0, "owner": "root", "group": "root", "mode": "0600", "state": "file", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "invocation": {"module_args": {"dest": "/etc/tuned/profile_mode", "mode": "0600", "src": "/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/source", "_original_basename": "tmpdrqjfmrr", "follow": false, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "backup": false, "force": true, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=Shared connection to 10.31.43.7 closed. 
8848 1726773038.32133: done with _execute_module (ansible.legacy.copy, {'dest': '/etc/tuned/profile_mode', 'mode': '0600', 'src': '/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/source', '_original_basename': 'tmpdrqjfmrr', 'follow': False, 'checksum': '43683f4e92c48be4b00ddd86e011a4f27fcdbeb5', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.copy', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8848 1726773038.32147: _low_level_execute_command(): starting 8848 1726773038.32154: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/ > /dev/null 2>&1 && sleep 0' 8848 1726773038.36417: stderr chunk (state=2): >>><<< 8848 1726773038.36429: stdout chunk (state=2): >>><<< 8848 1726773038.36446: _low_level_execute_command() done: rc=0, stdout=, stderr= 8848 1726773038.36456: handler run complete 8848 1726773038.36483: attempt loop complete, returning result 8848 1726773038.36490: _execute() done 8848 1726773038.36494: dumping result to json 8848 1726773038.36499: done dumping result, returning 8848 1726773038.36506: done running TaskExecutor() for managed_node1/TASK: Set profile_mode to auto [0affffe7-6841-f581-0619-00000000009c] 8848 1726773038.36513: sending task result for task 0affffe7-6841-f581-0619-00000000009c 8848 1726773038.36555: done sending task result for task 0affffe7-6841-f581-0619-00000000009c 8848 1726773038.36559: WORKER PROCESS EXITING changed: [managed_node1] => { "changed": true, "checksum": "43683f4e92c48be4b00ddd86e011a4f27fcdbeb5", "dest": "/etc/tuned/profile_mode", "gid": 0, "group": "root", "md5sum": "451e20aff0f489cd2f7d4d73533aa961", "mode": "0600", "owner": "root", "secontext": "system_u:object_r:tuned_etc_t:s0", "size": 5, "src": "/root/.ansible/tmp/ansible-tmp-1726773037.721496-8848-51922528233579/source", "state": "file", "uid": 0 } 8208 1726773038.37215: no more pending results, returning what we have 8208 1726773038.37218: results queue empty 8208 1726773038.37219: checking for any_errors_fatal 8208 1726773038.37226: done checking for any_errors_fatal 8208 1726773038.37227: checking for max_fail_percentage 8208 1726773038.37228: done checking for max_fail_percentage 8208 1726773038.37229: checking to see if all hosts have failed and the running result is not ok 8208 1726773038.37230: done checking to see if all hosts have failed 8208 1726773038.37230: getting the remaining hosts for this loop 8208 1726773038.37231: done getting the remaining hosts for this loop 8208 1726773038.37235: getting the next task for host managed_node1 8208 1726773038.37241: done getting next task for host managed_node1 8208 1726773038.37243: ^ task is: TASK: Restart tuned 8208 1726773038.37245: ^ state is: HOST STATE: block=2, task=2, rescue=1, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=6, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? True, did start at task? False 8208 1726773038.37249: getting variables 8208 1726773038.37250: in VariableManager get_vars() 8208 1726773038.37289: Calling all_inventory to load vars for managed_node1 8208 1726773038.37292: Calling groups_inventory to load vars for managed_node1 8208 1726773038.37294: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773038.37303: Calling all_plugins_play to load vars for managed_node1 8208 1726773038.37305: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773038.37308: Calling groups_plugins_play to load vars for managed_node1 8208 1726773038.37481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773038.37683: done with get_vars() 8208 1726773038.37696: done getting variables 8208 1726773038.37757: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restart tuned] *********************************************************** task path: /tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64 Thursday 19 September 2024 15:10:38 -0400 (0:00:00.701) 0:00:19.052 **** 8208 1726773038.37792: entering _queue_task() for managed_node1/service 8208 1726773038.38035: worker is 1 (out of 1 available) 8208 1726773038.38049: exiting _queue_task() for managed_node1/service 8208 1726773038.38062: done queuing things up, now waiting for results queue to drain 8208 1726773038.38063: waiting for pending results... 
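The second copy writes /etc/tuned/profile_mode with a 5-byte payload and mode 0600, consistent with the literal string "auto" plus a newline (tuned's profile_mode file holds either "auto" or "manual"). A minimal sketch of such a task is below; cleanup.yml:57 itself is not shown in the log, and the role variable __kernel_settings_tuned_profile_mode named above presumably supplies this value, so the literal content here is an assumption:

  - name: Set profile_mode to auto
    copy:
      dest: /etc/tuned/profile_mode
      content: "auto\n"   # 5 bytes, matching the size reported by the copy result above
      mode: "0600"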
8886 1726773038.39308: running TaskExecutor() for managed_node1/TASK: Restart tuned 8886 1726773038.39435: in run() - task 0affffe7-6841-f581-0619-00000000009d 8886 1726773038.39455: variable 'ansible_search_path' from source: unknown 8886 1726773038.39459: variable 'ansible_search_path' from source: unknown 8886 1726773038.39505: variable '__kernel_settings_services' from source: include_vars 8886 1726773038.39808: variable '__kernel_settings_services' from source: include_vars 8886 1726773038.39868: variable 'omit' from source: magic vars 8886 1726773038.39980: variable 'ansible_host' from source: host vars for 'managed_node1' 8886 1726773038.39994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8886 1726773038.40005: variable 'omit' from source: magic vars 8886 1726773038.40084: variable 'omit' from source: magic vars 8886 1726773038.40124: variable 'omit' from source: magic vars 8886 1726773038.40162: variable 'item' from source: unknown 8886 1726773038.40305: variable 'item' from source: unknown 8886 1726773038.40327: variable 'omit' from source: magic vars 8886 1726773038.40365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 8886 1726773038.40402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 8886 1726773038.40424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 8886 1726773038.40526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8886 1726773038.40539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 8886 1726773038.40568: variable 'inventory_hostname' from source: host vars for 'managed_node1' 8886 1726773038.40574: variable 'ansible_host' from source: host vars for 'managed_node1' 8886 1726773038.40578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8886 1726773038.40675: Set connection var ansible_shell_executable to /bin/sh 8886 1726773038.40680: Set connection var ansible_connection to ssh 8886 1726773038.40688: Set connection var ansible_module_compression to ZIP_DEFLATED 8886 1726773038.40696: Set connection var ansible_timeout to 10 8886 1726773038.40700: Set connection var ansible_shell_type to sh 8886 1726773038.40707: Set connection var ansible_pipelining to False 8886 1726773038.40727: variable 'ansible_shell_executable' from source: unknown 8886 1726773038.40732: variable 'ansible_connection' from source: unknown 8886 1726773038.40735: variable 'ansible_module_compression' from source: unknown 8886 1726773038.40739: variable 'ansible_shell_type' from source: unknown 8886 1726773038.40742: variable 'ansible_shell_executable' from source: unknown 8886 1726773038.40745: variable 'ansible_host' from source: host vars for 'managed_node1' 8886 1726773038.40749: variable 'ansible_pipelining' from source: unknown 8886 1726773038.40751: variable 'ansible_timeout' from source: unknown 8886 1726773038.40755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 8886 1726773038.40876: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 8886 1726773038.40890: variable 'omit' from source: magic vars 8886 1726773038.40896: starting attempt loop 8886 1726773038.40900: running the handler 8886 1726773038.40992: variable 'ansible_facts' from source: unknown 8886 1726773038.41112: _low_level_execute_command(): starting 8886 1726773038.41121: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 8886 1726773038.45494: stdout chunk (state=2): >>>/root <<< 8886 1726773038.45508: stderr chunk (state=2): >>><<< 8886 1726773038.45522: stdout chunk (state=3): >>><<< 8886 1726773038.45543: _low_level_execute_command() done: rc=0, stdout=/root , stderr= 8886 1726773038.45563: _low_level_execute_command(): starting 8886 1726773038.45573: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346 `" && echo ansible-tmp-1726773038.45555-8886-16609546233346="` echo /root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346 `" ) && sleep 0' 8886 1726773038.48693: stdout chunk (state=2): >>>ansible-tmp-1726773038.45555-8886-16609546233346=/root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346 <<< 8886 1726773038.48966: stderr chunk (state=3): >>><<< 8886 1726773038.48976: stdout chunk (state=3): >>><<< 8886 1726773038.48997: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726773038.45555-8886-16609546233346=/root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346 , stderr= 8886 1726773038.49029: variable 'ansible_module_compression' from source: unknown 8886 1726773038.49087: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-8208xr2beo8c/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 8886 1726773038.49154: variable 'ansible_facts' from source: unknown 8886 1726773038.49366: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346/AnsiballZ_systemd.py 8886 1726773038.50262: Sending initial data 8886 1726773038.50271: Sent initial data (151 bytes) 8886 1726773038.53435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-8208xr2beo8c/tmpbz4a4wes /root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346/AnsiballZ_systemd.py <<< 8886 1726773038.55951: stderr chunk (state=3): >>><<< 8886 1726773038.55961: stdout chunk (state=3): >>><<< 8886 1726773038.55984: done transferring module to remote 8886 1726773038.55997: _low_level_execute_command(): starting 8886 1726773038.56002: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346/ /root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346/AnsiballZ_systemd.py && sleep 0' 8886 1726773038.58666: stderr chunk (state=2): >>><<< 8886 1726773038.58678: stdout chunk (state=2): >>><<< 8886 1726773038.58699: _low_level_execute_command() done: rc=0, stdout=, stderr= 8886 1726773038.58704: _low_level_execute_command(): starting 8886 1726773038.58710: _low_level_execute_command(): executing: /bin/sh -c '/usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346/AnsiballZ_systemd.py && sleep 0' 8886 1726773038.86412: stdout chunk (state=2): >>> {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": 
"infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:35 EDT", "WatchdogTimestampMonotonic": "289081170", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9606", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:34 EDT", "ExecMainStartTimestampMonotonic": "288942483", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9606", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:34 EDT] ; stop_time=[n/a] ; pid=9606 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17100800", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", 
"CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service shutdown.target cpupower.service auto-cpufreq.service tlp.service", "Before": "shutdown.target multi-user.target", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:35 EDT", "StateChangeTimestampMonotonic": "289081173", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:34 EDT", "InactiveExitTimestampMonotonic": "288942523", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:35 EDT", "ActiveEnterTimestampMonotonic": "289081173", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:34 EDT", "ActiveExitTimestampMonotonic": "288802426", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:34 EDT", "InactiveEnterTimestampMonotonic": "288938927", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:34 EDT", "ConditionTimestampMonotonic": "288941255", "AssertTimestamp": "Thu 2024-09-19 15:10:34 EDT", "AssertTimestampMonotonic": "288941257", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "45783d8000a7410a9ffa885ff5517d4d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 8886 1726773038.87951: stderr chunk (state=3): >>>Shared connection to 10.31.43.7 closed. <<< 8886 1726773038.88000: stderr chunk (state=3): >>><<< 8886 1726773038.88008: stdout chunk (state=3): >>><<< 8886 1726773038.88028: _low_level_execute_command() done: rc=0, stdout= {"name": "tuned", "changed": false, "status": {"Type": "dbus", "Restart": "no", "PIDFile": "/run/tuned/tuned.pid", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "RuntimeMaxUSec": "infinity", "WatchdogUSec": "0", "WatchdogTimestamp": "Thu 2024-09-19 15:10:35 EDT", "WatchdogTimestampMonotonic": "289081170", "PermissionsStartOnly": "no", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "9606", "ControlPID": "0", "BusName": "com.redhat.tuned", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:34 EDT", "ExecMainStartTimestampMonotonic": "288942483", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9606", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:34 EDT] ; stop_time=[n/a] ; pid=9606 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/tuned.service", "MemoryCurrent": "17100800", "CPUUsageNSec": "[not set]", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "TasksCurrent": "4", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "Delegate": "no", "CPUAccounting": "no", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "AllowedCPUs": "", "AllowedMemoryNodes": "", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22405", "IPAccounting": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14003", 
"LimitSIGPENDINGSoft": "14003", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "Nice": "0", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "NUMAMask": "", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "AmbientCapabilities": "", "DynamicUser": "no", "RemoveIPC": "no", "MountFlags": "", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "ProtectHome": "no", "ProtectSystem": "no", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "0", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Id": "tuned.service", "Names": "tuned.service", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "WantedBy": "multi-user.target", "Conflicts": "power-profiles-daemon.service shutdown.target cpupower.service auto-cpufreq.service tlp.service", "Before": "shutdown.target multi-user.target", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "Description": "Dynamic System Tuning Daemon", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Thu 2024-09-19 15:10:35 EDT", "StateChangeTimestampMonotonic": "289081173", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:34 EDT", "InactiveExitTimestampMonotonic": "288942523", "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:35 EDT", "ActiveEnterTimestampMonotonic": "289081173", "ActiveExitTimestamp": "Thu 
2024-09-19 15:10:34 EDT", "ActiveExitTimestampMonotonic": "288802426", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:34 EDT", "InactiveEnterTimestampMonotonic": "288938927", "CanStart": "yes", "CanStop": "yes", "CanReload": "no", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:34 EDT", "ConditionTimestampMonotonic": "288941255", "AssertTimestamp": "Thu 2024-09-19 15:10:34 EDT", "AssertTimestampMonotonic": "288941257", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "45783d8000a7410a9ffa885ff5517d4d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "tuned", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=Shared connection to 10.31.43.7 closed. 8886 1726773038.88137: done with _execute_module (ansible.legacy.systemd, {'name': 'tuned', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.16.11', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346/', '_ansible_remote_tmp': '~/.ansible/tmp'}) 8886 1726773038.88157: _low_level_execute_command(): starting 8886 1726773038.88164: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726773038.45555-8886-16609546233346/ > /dev/null 2>&1 && sleep 0' 8886 1726773038.90667: stderr chunk (state=2): >>><<< 8886 1726773038.90679: stdout chunk (state=2): >>><<< 8886 1726773038.90696: _low_level_execute_command() done: rc=0, stdout=, stderr= 8886 1726773038.90704: handler run complete 8886 1726773038.90738: attempt loop complete, returning result 8886 1726773038.90756: variable 'item' from source: unknown 8886 1726773038.90823: variable 'item' from source: unknown ok: [managed_node1] => (item=tuned) => { "ansible_loop_var": "item", "changed": false, "enabled": true, "item": "tuned", "name": "tuned", "state": "started", "status": { "ActiveEnterTimestamp": "Thu 2024-09-19 15:10:35 EDT", "ActiveEnterTimestampMonotonic": "289081173", "ActiveExitTimestamp": "Thu 2024-09-19 15:10:34 EDT", "ActiveExitTimestampMonotonic": "288802426", "ActiveState": "active", "After": "network.target basic.target system.slice systemd-sysctl.service dbus.service dbus.socket sysinit.target systemd-journald.socket polkit.service", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "yes", "AssertTimestamp": "Thu 2024-09-19 15:10:34 EDT", "AssertTimestampMonotonic": "288941257", "Before": "shutdown.target 
multi-user.target", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "BusName": "com.redhat.tuned", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "yes", "ConditionTimestamp": "Thu 2024-09-19 15:10:34 EDT", "ConditionTimestampMonotonic": "288941255", "ConfigurationDirectoryMode": "0755", "Conflicts": "power-profiles-daemon.service shutdown.target cpupower.service auto-cpufreq.service tlp.service", "ControlGroup": "/system.slice/tuned.service", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Dynamic System Tuning Daemon", "DevicePolicy": "auto", "Documentation": "man:tuned(8) man:tuned.conf(5) man:tuned-adm(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "9606", "ExecMainStartTimestamp": "Thu 2024-09-19 15:10:34 EDT", "ExecMainStartTimestampMonotonic": "288942483", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/tuned ; argv[]=/usr/sbin/tuned -l -P ; ignore_errors=no ; start_time=[Thu 2024-09-19 15:10:34 EDT] ; stop_time=[n/a] ; pid=9606 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/usr/lib/systemd/system/tuned.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "tuned.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestamp": "Thu 2024-09-19 15:10:34 EDT", "InactiveEnterTimestampMonotonic": "288938927", "InactiveExitTimestamp": "Thu 2024-09-19 15:10:34 EDT", "InactiveExitTimestampMonotonic": "288942523", "InvocationID": "45783d8000a7410a9ffa885ff5517d4d", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": 
"65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "9606", "MemoryAccounting": "yes", "MemoryCurrent": "17100800", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "tuned.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PIDFile": "/run/tuned/tuned.pid", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Requires": "system.slice dbus.service sysinit.target dbus.socket", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system.slice", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2024-09-19 15:10:35 EDT", "StateChangeTimestampMonotonic": "289081173", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "running", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "4", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "Type": "dbus", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "enabled", "UnitFileState": "enabled", "UtmpMode": "init", "WantedBy": "multi-user.target", "WatchdogTimestamp": "Thu 2024-09-19 15:10:35 EDT", "WatchdogTimestampMonotonic": "289081170", "WatchdogUSec": "0" } } 8886 1726773038.90926: dumping result to json 8886 1726773038.90944: done dumping result, returning 8886 1726773038.90951: done running TaskExecutor() for managed_node1/TASK: Restart tuned [0affffe7-6841-f581-0619-00000000009d] 8886 1726773038.90957: sending task result for 
task 0affffe7-6841-f581-0619-00000000009d 8886 1726773038.91073: done sending task result for task 0affffe7-6841-f581-0619-00000000009d 8886 1726773038.91078: WORKER PROCESS EXITING 8208 1726773038.91422: no more pending results, returning what we have 8208 1726773038.91424: results queue empty 8208 1726773038.91425: checking for any_errors_fatal 8208 1726773038.91428: done checking for any_errors_fatal 8208 1726773038.91429: checking for max_fail_percentage 8208 1726773038.91429: done checking for max_fail_percentage 8208 1726773038.91430: checking to see if all hosts have failed and the running result is not ok 8208 1726773038.91430: done checking to see if all hosts have failed 8208 1726773038.91430: getting the remaining hosts for this loop 8208 1726773038.91431: done getting the remaining hosts for this loop 8208 1726773038.91434: getting the next task for host managed_node1 8208 1726773038.91439: done getting next task for host managed_node1 8208 1726773038.91440: ^ task is: TASK: meta (flush_handlers) 8208 1726773038.91441: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8208 1726773038.91444: getting variables 8208 1726773038.91445: in VariableManager get_vars() 8208 1726773038.91467: Calling all_inventory to load vars for managed_node1 8208 1726773038.91470: Calling groups_inventory to load vars for managed_node1 8208 1726773038.91471: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773038.91478: Calling all_plugins_play to load vars for managed_node1 8208 1726773038.91480: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773038.91482: Calling groups_plugins_play to load vars for managed_node1 8208 1726773038.91598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773038.91713: done with get_vars() 8208 1726773038.91721: done getting variables 8208 1726773038.91768: in VariableManager get_vars() 8208 1726773038.91776: Calling all_inventory to load vars for managed_node1 8208 1726773038.91777: Calling groups_inventory to load vars for managed_node1 8208 1726773038.91778: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773038.91781: Calling all_plugins_play to load vars for managed_node1 8208 1726773038.91782: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773038.91784: Calling groups_plugins_play to load vars for managed_node1 8208 1726773038.91861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773038.91966: done with get_vars() 8208 1726773038.91976: done queuing things up, now waiting for results queue to drain 8208 1726773038.91977: results queue empty 8208 1726773038.91978: checking for any_errors_fatal 8208 1726773038.91982: done checking for any_errors_fatal 8208 1726773038.91982: checking for max_fail_percentage 8208 1726773038.91983: done checking for max_fail_percentage 8208 1726773038.91983: checking to see if all hosts have failed and the running result is not ok 8208 1726773038.91984: done checking to see if all hosts have failed 8208 1726773038.91984: getting the remaining hosts for this loop 8208 1726773038.91984: done getting the remaining hosts for this loop 8208 1726773038.91988: getting the 
next task for host managed_node1 8208 1726773038.91991: done getting next task for host managed_node1 8208 1726773038.91992: ^ task is: TASK: meta (flush_handlers) 8208 1726773038.91993: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 8208 1726773038.91995: getting variables 8208 1726773038.91996: in VariableManager get_vars() 8208 1726773038.92002: Calling all_inventory to load vars for managed_node1 8208 1726773038.92003: Calling groups_inventory to load vars for managed_node1 8208 1726773038.92004: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773038.92007: Calling all_plugins_play to load vars for managed_node1 8208 1726773038.92008: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773038.92010: Calling groups_plugins_play to load vars for managed_node1 8208 1726773038.92088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773038.92191: done with get_vars() 8208 1726773038.92197: done getting variables 8208 1726773038.92226: in VariableManager get_vars() 8208 1726773038.92234: Calling all_inventory to load vars for managed_node1 8208 1726773038.92235: Calling groups_inventory to load vars for managed_node1 8208 1726773038.92237: Calling all_plugins_inventory to load vars for managed_node1 8208 1726773038.92239: Calling all_plugins_play to load vars for managed_node1 8208 1726773038.92240: Calling groups_plugins_inventory to load vars for managed_node1 8208 1726773038.92242: Calling groups_plugins_play to load vars for managed_node1 8208 1726773038.92317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 8208 1726773038.92422: done with get_vars() 8208 1726773038.92430: done queuing things up, now waiting for results queue to drain 8208 1726773038.92431: results queue empty 8208 1726773038.92432: checking for any_errors_fatal 8208 1726773038.92433: done checking for any_errors_fatal 8208 1726773038.92434: checking for max_fail_percentage 8208 1726773038.92434: done checking for max_fail_percentage 8208 1726773038.92435: checking to see if all hosts have failed and the running result is not ok 8208 1726773038.92435: done checking to see if all hosts have failed 8208 1726773038.92436: getting the remaining hosts for this loop 8208 1726773038.92436: done getting the remaining hosts for this loop 8208 1726773038.92438: getting the next task for host managed_node1 8208 1726773038.92440: done getting next task for host managed_node1 8208 1726773038.92441: ^ task is: None 8208 1726773038.92442: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 8208 1726773038.92442: done queuing things up, now waiting for results queue to drain 8208 1726773038.92443: results queue empty 8208 1726773038.92443: checking for any_errors_fatal 8208 1726773038.92443: done checking for any_errors_fatal 8208 1726773038.92444: checking for max_fail_percentage 8208 1726773038.92444: done checking for max_fail_percentage 8208 1726773038.92444: checking to see if all hosts have failed and the running result is not ok 8208 1726773038.92445: done checking to see if all hosts have failed 8208 1726773038.92446: getting the next task for host managed_node1 8208 1726773038.92448: done getting next task for host managed_node1 8208 1726773038.92449: ^ task is: None 8208 1726773038.92449: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1              : ok=33   changed=8    unreachable=0    failed=0    skipped=8    rescued=1    ignored=1

Thursday 19 September 2024  15:10:38 -0400 (0:00:00.547)       0:00:19.599 ****
===============================================================================
fedora.linux_system_roles.kernel_settings : Ensure required packages are installed --- 5.69s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:12
Gathering Facts --------------------------------------------------------- 2.34s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tests_bool_not_allowed.yml:2
fedora.linux_system_roles.kernel_settings : Restart tuned to apply active profile, mode changes --- 0.79s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:149
fedora.linux_system_roles.kernel_settings : Ensure kernel_settings is in active_profile --- 0.79s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:91
fedora.linux_system_roles.kernel_settings : Ensure required services are enabled and started --- 0.77s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:67
fedora.linux_system_roles.kernel_settings : Set profile_mode to manual --- 0.77s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:99
fedora.linux_system_roles.kernel_settings : Apply kernel settings ------- 0.74s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:112
Ensure kernel_settings is not in active_profile ------------------------- 0.72s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:46
Set profile_mode to auto ------------------------------------------------ 0.70s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:57
fedora.linux_system_roles.kernel_settings : Find tuned profile parent directory --- 0.68s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:50
fedora.linux_system_roles.kernel_settings : Check that settings are applied correctly --- 0.56s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/verify_settings.yml:2
Restart tuned ----------------------------------------------------------- 0.55s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:64
fedora.linux_system_roles.kernel_settings : Get active_profile ---------- 0.51s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:80
fedora.linux_system_roles.kernel_settings : Ensure kernel settings profile directory exists --- 0.47s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:74
fedora.linux_system_roles.kernel_settings : Read tuned main config ------ 0.43s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:42
fedora.linux_system_roles.kernel_settings : Check if system is ostree --- 0.42s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/set_vars.yml:10
Remove kernel_settings tuned profile ------------------------------------ 0.37s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:36
Get active_profile ------------------------------------------------------ 0.37s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:41
fedora.linux_system_roles.kernel_settings : Get current config ---------- 0.35s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/roles/kernel_settings/tasks/main.yml:107
Verify no settings ------------------------------------------------------ 0.34s
/tmp/collections-EI7/ansible_collections/fedora/linux_system_roles/tests/kernel_settings/tasks/cleanup.yml:20

8208 1726773038.92523: RUNNING CLEANUP
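Editor's note: the "Restart tuned" result logged above comes from a systemd module call whose module_args are visible in the output (name=tuned, state=started, enabled=true, daemon_reload=false, scope=system, loop item=tuned, resolved via ansible.legacy.systemd). The following is a minimal, hypothetical sketch of an equivalent task reconstructed only from those logged arguments; the real task at tests/kernel_settings/tasks/cleanup.yml:64 may be written differently, and the loop list here is an assumption based on the single item seen in this run.

    # Hypothetical reconstruction from the logged module_args, not the actual cleanup.yml task.
    - name: Restart tuned
      ansible.builtin.systemd:          # log shows resolution to ansible.legacy.systemd
        name: "{{ item }}"              # logged item=tuned
        state: started                  # module_args: state=started
        enabled: true                   # module_args: enabled=true
        daemon_reload: false            # module_args: daemon_reload=false
        scope: system                   # module_args: scope=system
      loop:
        - tuned                         # assumed loop contents; only item=tuned appears in this run

Because state=started (not restarted) and the unit was already active and enabled, the module reports "changed": false, which matches the ok (not changed) result recorded for managed_node1 in the recap above.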